RBM Technologies, based in Cambridge, Massachusetts, USA (just outside Boston), provides sales-floor-planning SaaS (Software as a Service) to large companies in the consumer retail/services space with many customer-facing locations.
My contract with RBM, arranged via Toptal, started in February 2015 and was originally anticipated to run about three to six months; it actually lasted about nine, ending in November when I left to take the US government contract. My main tasks were enhancements and bugfixes in two systems. All work was 100% remote.
The first system provides an API that answers queries written in a JSON-based custom query language, returning the results as JSON. It is intended for use by the second system, to reduce the time those queries take from about half an hour (sounds crazy, but that's what they said!) to a couple of seconds. (We were aiming for half a second, and most known queries quickly became that fast; a few still took up to a couple of seconds, and we agreed the extra effort to speed up those few would not be worthwhile.) For this one, I:
- Made it understand and answer many more types of queries, in a cleaner and more extensible way. (It had been composing a massive SQL string; I made it use ActiveRecord. We planned to move eventually to Arel, to get "or" capability, but put that on hold once the Rails team announced that ActiveRecord 5 would include it.)
- Extended its ability to store, modify, and delete objects in response to object-change notifications received over RabbitMQ. Previously it stored only certain columns whose meanings were the same for all clients, and which therefore lived in normal database columns in this program. I made it also store attributes whose meanings varied by client, plus objects' memberships in custom hierarchies defined by the client. The custom ("ad-hoc") attributes were stored in PostgreSQL HSTOREs, and the hierarchies as trees using the Ancestry gem. Both were wrapped in additional layers so a search could be constrained to a particular timeframe.
- Added an "options" endpoint that returns the names and values for assorted options, including those ad-hoc attributes and custom hierarchies.
- Added the ability to download query results in CSV format.
- Added the ability to allow fixture types to "act as" another type, so that when the latter is queried, the former is included.
- Added the ability to allow the client to ask for "sets" of certain types of fixtures, by designating that a fixture of a given type "counts as" some fraction of a full fixture. This also worked in reverse: a fixture of a given type could "count as" some number of complete fixtures.
- Made it use the other program's database directly, instead of its own summarized copy, eliminating the prior need for special treatment of client-varying attributes, custom groupings, item updates received over RabbitMQ, the "options" endpoint, and other complications.
- That in turn required changing the multi-tenancy arrangement from switching schemas to switching entire databases (albeit not servers), even though we were still using PostgreSQL, which does support separate schemas.
- Made it use the other program's tables directly, rather than a view as done previously, since the view was severely impacting performance, as proven when I:
  - Wrote a program to create and benchmark queries done various ways (such as through a view involving many tables versus joining tables only as needed), and used it to create benchmark spreadsheets.
- Unified the different kinds of queries into one, reducing redundancy in both the "production" codebase and the test suite.
- Updated the Capistrano deployment process and files.
- Removed multi-tenancy entirely, opting for one installation per client with no sharing.
- Helped define several new syntax parts for the query language.
- Helped define a new request-authentication process.
- Fixed some bugs affecting development and testing.
- Fixed a bug preventing referrer-based authentication when the tenant database name did not follow a strict assumption made by prior developers, an assumption that was impossible to satisfy in the test-lab environment.
- Fixed a bug that created different cache keys for searches with identical parameters merely mentioned in a different order.
- Made it an engine to be used inside the other program, which included downgrading it from Rails 4.2 to 3.0, resolving other conflicts, and packaging it as a gem.
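To illustrate the fixture "count as" idea from the bullets above, here is a minimal Ruby sketch. The type names and factors are hypothetical; the real system stored these per fixture type in the database:

```ruby
# Hypothetical "counts as" factors: a half-width display counts as half
# a full fixture, while a double-wide table counts as two full fixtures.
COUNT_AS_FACTORS = {
  "half_width_display" => Rational(1, 2),
  "standard_shelf"     => Rational(1),
  "double_wide_table"  => Rational(2),
}

# Given a tally of fixtures by type, return the equivalent number of
# "full" fixtures, i.e. how many complete sets a location effectively has.
def full_fixture_count(tally)
  tally.sum { |type, count| COUNT_AS_FACTORS.fetch(type, Rational(1)) * count }
end

puts full_fixture_count("half_width_display" => 4, "double_wide_table" => 1).to_i
# => 4 (4 * 1/2 + 1 * 2)
```

Using `Rational` rather than floats keeps the fractional factors exact, so counts never drift from rounding.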
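The benchmarking program mentioned above followed a pattern like this Ruby sketch, using the standard-library Benchmark module. The two workloads here are stand-ins; the real harness timed actual SQL queries through the wide view versus targeted joins:

```ruby
require "benchmark"

def via_wide_view(rows)
  # Stand-in for querying through a wide multi-table view: touches every column.
  rows.map(&:values).select { |vals| vals.first.even? }
end

def via_targeted_join(rows)
  # Stand-in for joining only the tables a query actually needs.
  rows.select { |row| row[:id].even? }
end

rows = Array.new(50_000) { |i| { id: i, name: "row#{i}" } }

# Print wall-clock timings for each approach, side by side.
Benchmark.bm(16) do |bm|
  bm.report("wide view")      { via_wide_view(rows) }
  bm.report("targeted joins") { via_targeted_join(rows) }
end
```

Running each approach over the same data and tabulating the timings is what fed the benchmark spreadsheets.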
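The cache-key fix above boiled down to normalizing parameter order before building the key. A simplified sketch of the idea (not the production code; the helper name is hypothetical):

```ruby
require "digest"
require "json"

# Build a cache key that is stable regardless of the order in which the
# search parameters were given: sorting the (key, value) pairs before
# serializing makes {a: 1, b: 2} and {b: 2, a: 1} hash identically.
def search_cache_key(params)
  normalized = params.map { |k, v| [k.to_s, v] }.sort_by(&:first)
  "search/#{Digest::SHA256.hexdigest(JSON.generate(normalized))}"
end

a = search_cache_key(brand: "Acme", region: "NE")
b = search_cache_key(region: "NE", brand: "Acme")
puts a == b  # true: same parameters, same key, so the cache actually hits
```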
The second system is their "flagship" application, under constant active development by a largish team. I worked on it while they made some decisions about the first one. It is for planning floor layouts for chains of retail stores and other such consumer service centers, such as banks, big-box retailers, and cellphone stores. For this one, I:
- Made it filter a list of "planograms" (arrangements of content on fixtures) by the category or more specific type of merchandise they cover, the sales campaign in which they were last updated, and/or how that update was done.
- Added an icon and highlighting to planograms with unseen changes in that same list, and let users filter and sort by that status.
- Removed those markings when a user clicks a planogram in the list to load it into the editor, unless the user has certain "stealth" permissions.
- Added an approval checkbox to the planograms in that list, with adding or removing approval also marking the planogram as "seen", unless the user has certain "stealth" permissions.
- Changed which fields were imported during CSV import processing, and some of the calculations derived from those fields.
- Made some reports exclude lines for locations outside the chosen campaign or location filter (previously they were cluttered with all locations).
- Made ingestion of the client data feed add more data to the names of automatically created filters. Also made it more efficient, by creating filters only for the combinations of certain factors actually used in the current file, not all possible combinations.
- Found and fixed a bug that removed too many options from a dynamic list meant to match only a string typed by the user.
- Found and analyzed (someone else then fixed) a bug in the search for sales content arrangements left over from prior sales campaigns.
- Found the root cause of a bug preventing customer admins from performing certain actions; I can't claim credit for the fix, since it was a data problem to be solved by other means.
- Fixed a bug yielding incorrect counts of how many copies of a given piece of content each location had for a given sales campaign.
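The dynamic-list fix above (the one that removed too many options) comes down to keeping exactly the options that contain the typed substring. A hypothetical Ruby sketch of the corrected behavior:

```ruby
# Filter a list of option labels down to those matching the user's typed
# string, case-insensitively. The buggy version pruned options that
# should have matched; the correct behavior keeps every option whose
# label contains the typed substring anywhere.
def filter_options(options, typed)
  return options if typed.to_s.empty?
  needle = typed.downcase
  options.select { |opt| opt.downcase.include?(needle) }
end

options = ["Spring Campaign", "Fall Campaign", "Clearance"]
puts filter_options(options, "camp").inspect
# => ["Spring Campaign", "Fall Campaign"]
```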
I also advised on matters of coding style, tools, and process, such as ticket granularity, ticket state definitions, and what to test in what way.
Technologies, techniques, and tools I used there:
- Ruby on Rails 4.2 with Ruby 2.1, for the query-answerer
- Ruby on Rails 3.0 (yes, they knew it was horribly obsolete) with Ruby Enterprise Edition 1.8.7 (ditto) for one version of the sales floor planner and Ruby 2.1 for another
- Haml for views
- RSpec for testing
- ActiveModel::Serializer to customize the serialization format broadcast by the "main program" and read by the "query-answerer" (before we decided to use the main one's database directly)
- PostgreSQL 9.3, including HSTOREs and multiple schemas
- Pivotal Tracker
- RabbitMQ 3.4
- Redis 2.8
- Google Hangouts for daily standup meetings
- Initially Skype, then later Slack, for person-to-person IM plus chat rooms
- The Apartment gem, to switch schemas within a database, and later entire databases
- Capistrano for deployment