[A Look Inside] The iConnect Architecture

One of the guiding values of Icarus Connect is our desire to share knowledge. That value first sparked our flagship product, iConnect, which lets you tap into your connected things and consume the knowledge they gather about your environment. In keeping with that value, we also make it our goal to share our technical experience, and some of our technical solutions, with those outside our organization. Our platform was built by a really cool team of developers who sometimes relied on the experience and knowledge of the technical community to help design our systems. Our team has also gone through many cups of coffee (and cans of Red Bull) to find ways to do what no one else has done before, and we want to give back to the community that helped us when we couldn't see that missing semicolon or misspelled variable name.

Today I want to talk a little about the architecture behind iConnect. It has been a long path to get here and, while we are always evolving as a tech company to do better, I feel we have really nailed down our core architecture. We probably haven't pioneered Web 3.0 (does anyone remember when Web 2.0 was the big buzzword?), but what we have built validates the choices of other companies with similar architectures, and it may help guide you when you need to make your own architectural decisions. So what did we land on, and where did we start?

Back in the days of our infancy, before we had even decided on a name, our founding members were slaving away over a keyboard, hammering out a cloud platform application in PHP and MySQL. The application was a hybrid offering of B2B and B2C features that specifically targeted in-home connected device solutions. Our goal was to simplify the lives of our consumer customers while providing valuable insights to our business customers about the home. As we learned more about the IoT (Internet of Things) market, we realized we needed to change to meet its needs. Unfortunately, we found that we were ahead of the game by about six years. While IoT in the home was becoming popular, there were fears around connected homes, barriers such as pricing and compatibility, and a lack of standardization within the vendor community. We continued down our path, learning more with every line of code, but we kept erasing and re-drawing on the whiteboard as we dialed our vision back to the present day.

Eventually, we found that IoT had penetrated further into the B2B sphere. Many businesses were using connected devices, whether they knew it or not. Most of these were dumb devices that gathered data but didn't act on it in any meaningful way; smart devices were the rare exception. It was during this discovery of where IoT stood that we realized two things: 1) businesses were going to pave the evolutionary path of IoT and, ultimately, would bring it to the home and the consumer; and 2) businesses had all of this data that they either did not know about or did not know how to use, and were losing out on potential money-saving opportunities. Introduce a two-part pivot!

As we pivoted our business objective, we also saw the need to pivot our technology architecture. We needed to handle massive amounts of data, work in a multi-threaded environment, and field a strong army of workers that could crunch through raw data and turn it into something meaningful. We moved our PHP code into a code vault for later product add-ons and pulled out parts to rewrite in C#. We picked the combination of C# and PostgreSQL for its ability to handle a large flow of data. Now we were making fire: our development team pushed through the code, designed innovative solutions to make Microsoft software play nicely with non-Microsoft software, and created that army of workers. This solution lived across a Unix server and Microsoft Azure. As we learned more about our target audience in the process, we began to realize one thing… IoT is EXPENSIVE! As a platform provider we have a bottom line, our break-even point, that has to be built into the package price for each of our clients. As a company with employees and a vision to grow, we had to be able to price our packages above that break-even point. Introduce our second technology pivot.

Why did we decide to pivot again?! Well, as I said previously, IoT is expensive. So expensive, in fact, that we wanted to bring our customers an analytics solution that didn't break the bank. Our new question was, "How do we create a lightweight yet robust solution that won't cost our customers a fortune?" What we finally settled on was a series of technologies sitting on top of the Heroku platform. Heroku gives us the ability to scale as our needs scale, lets us deploy to our customers in a controlled manner, and offers many out-of-the-box integration solutions. With Heroku as the foundation, we began to identify technologies.
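On Heroku, the split between the web-facing process and the background workers is declared in a Procfile. The process commands below are an illustrative sketch, not our actual configuration; the config file paths are hypothetical stand-ins:

```
# Procfile (illustrative): one web dyno type, one worker dyno type.
# Heroku lets each be scaled independently as load grows.
web: bundle exec puma -C config/puma.rb
worker: bundle exec sidekiq -C config/sidekiq.yml
```

Because each line defines its own dyno type, scaling the data-crunching side up or down never requires touching the web tier.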

To keep our code robust and lightweight, and to give it the ability to scale with growing workloads, we decided to use Ruby and the popular Rails framework (Ruby on Rails). Not only was this an ideal fit for our primary needs, but it also met our secondary need of converting away from C# as quickly as possible. The conversion also allowed us to offer lower prices by removing the Microsoft licensing and usage fees we would otherwise have had to build in. Integrating and automating our backend system with our business partners and API providers was quick and painless, allowing us to propel even further ahead and start working on key components (such as our army of workers) ahead of schedule! Thanks to Ruby and Heroku, we were able to make use of two packages that really simplified life in this area: Sidekiq and Sidekiq-Cron. With these two packages we could not only offload jobs to background workers sitting anxiously on dedicated Heroku dynos, waiting to crunch through some hardcore raw data, but also schedule those jobs to run on set intervals. IoT produces data at the rate of seconds and minutes, and since not all IoT solutions have advanced far enough to offer data pushes, we had to be able to pull that data as fast as it was generated. The powerful combination of these two packages, along with the worker dynos Heroku can spin up on the fly, was the perfect solution.
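As a rough illustration of that pull-on-a-schedule pattern, here is a minimal sketch of a polling job. The class name, queue name, and `fetch_raw_readings` helper are hypothetical stand-ins (not our actual code), and the Sidekiq-specific lines are commented out so the sketch stands alone without the gems installed:

```ruby
# Hypothetical background job that pulls raw readings from a device
# and normalizes them into hashes. With the sidekiq gem installed you
# would uncomment the two Sidekiq lines and enqueue asynchronously
# with PollDeviceReadings.perform_async(device_id).
class PollDeviceReadings
  # include Sidekiq::Worker
  # sidekiq_options queue: :polling, retry: 3

  def perform(device_id)
    fetch_raw_readings(device_id).map do |raw|
      { device: device_id, value: raw.to_f, polled_at: Time.now.utc }
    end
  end

  private

  # Stubbed vendor call; the real version would hit the device's API.
  def fetch_raw_readings(_device_id)
    ["21.5", "21.7", "21.6"]
  end
end

# Sidekiq-Cron then turns the job into a recurring pull at a fixed
# interval. Shown as a comment for illustration; it needs the
# sidekiq-cron gem at runtime:
#
# Sidekiq::Cron::Job.create(
#   name:  'poll device 42 - every minute',
#   cron:  '* * * * *',
#   class: 'PollDeviceReadings',
#   args:  [42]
# )
```

The cron expression controls how often a worker dyno picks up the job, which is how a pull-based integration can keep pace with devices that emit data every minute but offer no push mechanism.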

We are now well along in the development of our Alpha product, with a Beta release coming up faster than anticipated, and we are very happy with the solution we arrived at. Initial benchmarks of this solution against some of our key IoT partners have been not only successful but have exceeded expectations. Stay tuned as we continue to share more about our travels through the IoT landscape.