Category Archives: Technology

Posted on Thu, Mar 1, 2018 @ 7:26 am

Oxford Dictionaries defines augmented reality as “A technology that superimposes a computer-generated image on a user’s view of the real world, thus providing a composite view” (Oxford, 1).  It goes on to describe augmented reality, or AR, as “a technology that works on computer vision based recognition algorithms to augment sound, video, graphics and other sensor-based inputs on real-world objects using the camera of your device” (Oxford, 1).  Put more succinctly, the purpose of AR is to place graphics, and even audio, on top of a real-world environment in real time.  While this is fairly common knowledge, most people are unaware that there are several different types of AR:

  • Marker Based Augmented Reality – Uses a camera, typically one on your smartphone, and some type of visual marker, such as a sign, car, or some other real-world physical object, to produce a result.  When that marker is computed and deemed a marker of importance, specific content or information is shown over that particular marker.
  • Markerless Augmented Reality – Sometimes referred to as location based, this is one of the most commonly used applications of AR.  This form of AR uses GPS and compass technologies in your device, typically a smartphone, to determine your location and show you broad stroke information about where you are and what you are interested in.
  • Projection Based Augmented Reality – This form of AR actually projects an image or field of light onto a real-world surface, then allows humans to interact with that projection, senses what they are doing with it, and reacts accordingly.  A perfect example of this is a projected keyboard.
  • Superimposition Based Augmented Reality – This form of AR is related to Marker Based AR; however, it differs in one major way: it takes a view and completely replaces it with something different.  A good example of this form of AR is an interior decorating application: take a picture of your living room, then drag and drop pieces of furniture from a catalog into the room to see how things will look and fit.

How does it work?  

As you can start to see from the outline of what AR is above, there are really four major components of any AR system: sensors and cameras, processing, projection, and reflection.  To get a good understanding of how AR actually works, we will briefly review each of these components.

Sensors and cameras are at the core of every AR solution since their main job is to gather information about what is going on around the user.  They typically sit on the outside of a device, gather information about what is happening in the “real world”, and then transfer that information to a processor to be interpreted.  Many people only think of cameras in this part of AR, but sensors provide valuable information such as temperature, the angle at which the device is being held, and even the user’s elevation (perhaps in a 39-story building in NYC).  More often than not there will be more than one camera and sensor on an AR device, as they all provide different data points to the processor.  Some are responsible for gathering depth information, others simply for image capture, and still others for video and other informational capture.  After the sensors and cameras gather everything they can about a user’s real-world surroundings, the data then needs to be processed.

Processors are the next major component of an AR system, and they are typically so powerful that they act like mini supercomputers in the palm of our hands.  When one hears the word “processor,” they tend to think only of a CPU, or central processing unit, but many other components come into play in an AR system.  Think about everything an AR system needs to function properly in addition to the CPU: a GPU, RAM, flash memory, WiFi, GPS, Bluetooth, an accelerometer, a magnetometer, and even a gyroscope!  Each of these components plays a distinct and very specific role in gathering or interpreting the data coming from the cameras and sensors.  Take one of them out of the picture and the user experience would be lacking.  After the processors assimilate and interpret the data being sent to them by the cameras and sensors, it is time to produce an output for the user.  As mentioned above, the output can be either projected or reflected for the user to interact with.
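To make the sensor-fusion idea concrete, here is a toy sketch of a complementary filter, one common way a processor blends a fast-but-drifting gyroscope with a noisy-but-stable accelerometer.  This is not taken from any real AR system; every constant below is illustrative.

```python
# Toy complementary filter: blend a gyroscope (fast, but drifts over time)
# with an accelerometer (noisy, but stable long-term) to estimate device tilt.
# All values are illustrative; real systems use calibrated sensors and 3D math.

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Weight the gyro-integrated angle against the accelerometer's estimate."""
    gyro_angle = angle + gyro_rate * dt        # integrate angular velocity
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Simulate a device held steady at 10 degrees with a gyro that drifts slightly.
angle = 0.0
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.5, accel_angle=10.0, dt=0.01)

print(round(angle, 1))  # converges toward the accelerometer's ~10-degree reading
```

The `alpha` weight is the design choice: higher values trust the responsive gyro from frame to frame, while the small accelerometer share slowly corrects the drift.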

Projection components of an AR system are responsible for displaying the augmented reality pieces onto the real-world background of the user.  Projection-based AR works by utilizing a mini-projector typically facing forward on a wearable AR headset or something like it.  It essentially transforms any surface into an environment with which the user can interact.  Today the projection typically takes place on a screen in front of your eyes (think smartphone or tablet) but in the near future it is predicted that AR projectors will be powerful and intelligent enough to eliminate the need for a screen at all, making it possible for real-world surfaces to become part of the AR experience.

Reflection components are the second way that an AR environment can be created for the user.  These systems function a bit differently from projectors in that they use mirrors to focus and alter the way the user sees the information being presented.  Through a mix of light projection and varying levels of reflective mirrors and screens, the AR system can not only show the user certain things but also read their input and interaction with those things.

Now that we have a working knowledge of what AR is and the components that enable it, we should look a bit deeper into practical applications of this technology in today’s world.

How is AR being used today?

There are many practical and useful applications of augmented reality in today’s world, but let’s think about a few to get our minds going.  In business, consider remote collaboration and augmented office spaces; in manufacturing and e-commerce, think repairs and product showcasing; and in the travel industry, ponder the ways tours and maps can be enhanced.

In today’s business world our work environments span the globe, with teams spread across two or three continents, making collaboration more important than ever and next to impossible to achieve.  Think about the teleconference: a phone meeting with 5–10 people who cannot see each other, cannot read body language, and are constantly talking over one another.  What happens?  Most people become disengaged and start “multi-tasking,” which we all know means they have completely stopped paying attention and contributing to the discussion.  By implementing components of AR into your meetings, you can bring people together in ways a simple teleconference system cannot.  Now, switch your mind to office spaces and people who are located in the same place.  Currently, in many office buildings, there are meeting and team spaces that are tailored and built for specific purposes.  In the very near future, imagine generic meeting spaces with no specific features and a few plain, load-bearing objects, like tables and chairs.  Then imagine AR projectors and reflectors creating a project room specific to the needs of that particular team and meeting objective.  Need 4 walls of whiteboards?  No problem.  Need to present your project timeline on one wall and an outline of objectives on another, and interact with them both at the same time?  No problem.  AR can easily provide solutions to both the communication and workspace problems we all experience today.

The manufacturing industry has some unique challenges of its own around repairs and around training employees to fix a wide range of increasingly complex products quickly and completely to ensure customer satisfaction.  AR overlays can assist a technician in diagnosing and fixing a problem in real time.  Consider an engineer repairing a jet engine: an AR application enabling them to see an overlay of the machinery, with repair information, temperature sensor readings, next steps, and clear directions on which hose to disconnect, could reduce overall cycle time for repairs, provide on-the-job training, and increase the chances that the repair will be done correctly the first time.

In the growing eCommerce industry, consumers are purchasing products they can’t hold, touch, or see for themselves, which creates anxiety in some and a fear of purchasing in others.  It can also create a high volume of returns, with dissatisfied consumers sending back goods that weren’t what they thought they were purchasing.  With the introduction of AR, consumers can now virtually see and manipulate the things they wish to purchase in great detail before they click the “checkout” button.

The last real-world example of how AR is being used today comes from the travel industry.  Think about the last time you took a vacation and went on a tour.  Most likely you were one of many people in a large group and either had a tour guide or a physical map to guide you through the sites.  With AR, travelers can get all the available information about a building, painting, house, or other landmark simply by looking at it.  Information about that object is displayed on the device they are using to view it.

With our wide and in-depth experience in developing mobile applications powered by frameworks like ARKit for iOS and ARCore for Android, OFS can build AR applications for you.  Contact us here to set up a time to talk with us about your questions, ideas, and interest in implementing AR in your apps.

Works Cited:

1.     (Oxford, 1) – Oxford Dictionaries, “augmented reality”: https://en.oxforddictionaries.com/definition/augmented_reality

Ganeshram Ramamurthy is ObjectFrontier’s technical director and heads technology for presales. For many years, Ganesh has been designing and developing enterprise applications across various domains. He has a keen interest in emerging technologies and is now spearheading blockchain initiatives at OFS.

Posted on Mon, Nov 20, 2017 @ 4:00 pm

You have a cloud solution. Congratulations! Now, how do you sell it?

After you built your cloud solution, your next key decision is choosing the right go-to-market (GTM) strategy. There are many factors you’ll need to consider here, including subscriptions, pricing, and sales channel.

When crafting the GTM strategy for a cloud-based solution, you need to look at the subscriptions or offerings you will provide to your customers. Remember that customers like having options from which to choose, but too many options actually can deter them from making a decision. That’s why most cloud solution providers settle on three or four subscription options.  How these subscriptions are structured depends on the solution provided. Some solutions lend themselves to a feature-based subscription model, e.g., the Silver subscription has 10 features, the Gold has 20, and the Platinum has all of them. On the other hand, some solutions lend themselves to a usage-based subscription model, e.g., the Silver subscription provides up to 1,000 transactions a month, the Gold provides 5,000, and the Platinum provides 10,000.  Defining usage-based subscriptions should be done with operating and transactional costs in mind. Defining feature-based subscriptions can be a bit more confusing.
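As a concrete sketch of the two structures described above, here is how the tiers might be modeled in code.  The tier names, feature flags, and transaction limits are hypothetical examples, not recommendations.

```python
# Sketch of the two subscription structures: feature-based tiers gate which
# capabilities a customer gets, usage-based tiers gate how much they can do.
# All names and numbers below are invented for illustration.

FEATURE_TIERS = {
    "Silver":   {"reporting": False, "api_access": False},
    "Gold":     {"reporting": True,  "api_access": False},
    "Platinum": {"reporting": True,  "api_access": True},
}

USAGE_TIERS = [  # (tier, max transactions per month), cheapest first
    ("Silver", 1_000),
    ("Gold", 5_000),
    ("Platinum", 10_000),
]

def smallest_tier_for(monthly_transactions):
    """Return the cheapest usage-based tier that covers the expected volume."""
    for tier, limit in USAGE_TIERS:
        if monthly_transactions <= limit:
            return tier
    return None  # beyond Platinum: time for an enterprise conversation

print(smallest_tier_for(3_200))  # a 3,200-transaction customer lands in Gold
```

Keeping the tier table ordered cheapest-first makes the selection logic a single pass, and returning `None` past the top tier flags accounts that need custom pricing.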

When considering a feature-based subscription model (one of the most common), an important question to keep in mind is this: What are the firebreaks between each subscription level?  Why would a customer want to purchase a higher-cost subscription level rather than a lower-cost one?  What are the “carrots” you can use to entice a customer to move up to a higher subscription?  There are many ways to think about this, but two major lines of thinking come up often.  One is to place entire features in only one subscription level, e.g., reporting is only in the Platinum subscription level.  The other is to spread a feature out over multiple subscriptions, e.g., canned reporting is available in the Silver package, basic report configuration in the Gold, and customized report wizards in the Platinum.  The idea here is to whet customers’ appetites with a feature in a lower subscription, knowing that once they see it, they will want more.  When they ask for more, you have the perfect answer: upgrade to the next subscription level, and you can have it.

So, now that you have your subscription levels defined, you have to figure out how to price each level.  Pricing is an art with many components to get right.  Some factors to consider when defining pricing for a cloud-based solution are your costs, the license term, revenue recognition and generally accepted accounting principles (GAAP), and the value of building a stream-revenue base.

When thinking about pricing, the first thing to consider is your cost.  Building, delivering, and hosting a cloud-based solution is free for your customers, but certainly not for you.  One of the most difficult things to understand is how much it will cost to host and maintain a new cloud-based solution.  Work with your cloud provider to estimate compute and storage costs; doing so on a “per transaction” basis often can help.  Then, you can estimate how many transactions each subscription level might generate per month and include this in your pricing model.  The next step is to think about the license term.  Most smaller, less expensive solutions choose a month-to-month subscription, which gives the customer the option to exit every 30 days without penalty.  For larger solutions with higher costs for onboarding a customer, some will choose a quarterly or even annual license model.  It all depends on the investment made in onboarding on both sides of the table.
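The “per transaction” estimate above can be sketched as a small cost model.  Every figure here is invented for illustration; real numbers come from your cloud provider’s bill.

```python
# Back-of-the-envelope hosting cost for one subscription tier, using the
# "per transaction" approach. All dollar figures below are made up.

COST_PER_TRANSACTION = 0.002    # estimated compute + storage per transaction
FIXED_MONTHLY_OVERHEAD = 150.0  # base hosting, monitoring, backups

def monthly_hosting_cost(expected_transactions, tiers_sharing_overhead=3):
    """Estimate what serving one tier costs per month: variable cost driven
    by transaction volume, plus an even share of the fixed overhead."""
    variable = expected_transactions * COST_PER_TRANSACTION
    return variable + FIXED_MONTHLY_OVERHEAD / tiers_sharing_overhead

# A Gold-style tier expected to generate ~5,000 transactions a month:
print(round(monthly_hosting_cost(5_000), 2))  # 60.0 dollars/month to serve
```

Even a crude model like this puts a floor under each tier’s price: if the subscription sells for less than its hosting cost, no amount of volume fixes the margin.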

One of the most important things to consider when defining the pricing model for a new cloud-based solution is GAAP and revenue recognition.  Many companies moving away from on-premise, monolithic software solutions are used to recognizing all of the revenue from a sale up front.  However, when you’re selling a cloud-based solution, revenue generally can only be recognized as the service is delivered, typically on a monthly basis, regardless of how you price it: revenue is recognized as your product is consumed.  This changes the revenue recognition model, which can affect many aspects of an organization, such as profitability, sales channel, and sales model, to name a few.  An organization launching a cloud-based solution has to be prepared to build a stream-revenue business instead of an up-front, license-revenue business.
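The shift from up-front to monthly recognition can be illustrated with a simplified schedule.  This is a sketch only; real GAAP treatment (e.g., under ASC 606) involves considerably more than spreading a number evenly.

```python
# Simplified illustration: a $12,000 annual cloud contract is recognized
# month by month as the service is delivered, not all at once at signing.

def monthly_recognition(contract_value, term_months=12):
    """Spread contract revenue evenly across the service term."""
    return [round(contract_value / term_months, 2)] * term_months

schedule = monthly_recognition(12_000)
print(schedule[0], sum(schedule))  # 1000.0 recognized each month, 12000.0 total
```

The same cash changes hands either way; what changes is when it counts as revenue, which is why profitability can look worse in year one of a cloud transition.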

Cloud-based solutions are often billed monthly or quarterly (depending on the size and complexity of the solution), and because of this, they typically have a lower billable price point.  Due to low initial revenue streams, it often does not make financial sense to use traditional sales channels to sell these solutions.  It is difficult and often cost-prohibitive to pay a sales rep commission on a monthly service; the price point and resulting compensation are so low that most field sales reps won’t even want to sell the solution.  Because of this, most companies that offer cloud-based solutions use a sales channel other than field sales.  Some will use inside sales reps, and others will use digital sales channels, using products like Salesforce’s Marketing Cloud to market and sell online solutions to customers.  While the GTM channel is an important choice to make, regardless of the outcome of that choice, enabling the customer to see and use the solution before they purchase it is a key component of success.

Free trials are key to a successful launch of any cloud-based solution.  For one thing, people have come to expect a free trial, so not having one is strike one against any cloud solution launch.  Secondly, customers need to see the solution before they purchase it, so they can understand how it works and ensure it fits their needs.  Remember, cloud-based solutions typically have no customization (custom coding) and minimal configuration (settings).  Enabling a free trial will allow your customers to see the value your solution can bring out of the box.  A few notes to ensure your free trial is a successful one:

  1. Full Features – Regardless of the subscription model chosen above, the free trial should ideally contain the full feature set and no usage restrictions to enable the customer to get the full experience of the solution.
  2. Nurturing Programs – A free trial is great, but remember that it is just that: free.  What often happens with free stuff?  Because it’s free, it has little to no perceived value, so users often forget about free trials and don’t use them.  Nurturing programs are essential to converting as many free trial users to paying customers as possible.  Here are some major components of a successful nurturing program:
    • Welcome emails
    • “Did you know?” emails, in which features and use cases are outlined
    • “We noticed you haven’t logged in. Can we help?” emails
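As a toy illustration of how such a program might decide which of the messages above to send next, here is one possible rule.  The thresholds and email names are invented; a real nurturing system would be driven by your marketing platform.

```python
# Hypothetical decision rule: pick the next nurturing email for a trial user
# based on when they signed up and when they last logged in.
from datetime import date, timedelta

def next_nurture_email(signup_date, last_login, today):
    """Return which (invented) email template to send this trial user."""
    if last_login is None or (today - last_login) > timedelta(days=7):
        return "we_noticed_you_havent_logged_in"   # re-engagement nudge
    if (today - signup_date) <= timedelta(days=1):
        return "welcome"                            # fresh signup
    return "did_you_know"                           # feature/use-case drip

today = date(2017, 11, 20)
print(next_nurture_email(date(2017, 11, 1), date(2017, 11, 5), today))
```

The ordering matters: the inactivity check runs first so a lapsed user gets the re-engagement nudge rather than another feature email they will never open.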

Nurturing programs should not stop after the free trial is over and should continue after a purchasing decision is made. Never forget that you could potentially lose every customer every month in the cloud model. Because of this, you need to keep in touch with your customers to ensure they know you are there. Do you have a new feature? Did you just learn about a new development or trend in the market? Let your customers know about these important updates, because this shows them you are a partner and not just a solutions provider.

We hope you enjoyed our 3-part blog series on cloud solutions! If you didn’t catch the first two in the series, click here to get started. If you would like to set up a time to learn more about how we can help you build an effective cloud solution, contact us here.

About the Author

Abdul Rafay Mansoor is a technical architect at ObjectFrontier, Inc., and his work primarily involves presales consulting. Abdul has been a developer for more than a decade, and he began taking on presales consulting roles a few years ago. Abdul’s area of interest is cloud native development, and you often will find him passionately advocating cloud adoption to our clients.

Posted on Thu, Nov 9, 2017 @ 12:35 pm

So, you’ve decided to build a cloud-based solution, but where do you go from here? There are a few key things that should be thought through, realized, and decided upon even before you start building your cloud solution. They revolve around what a customer of a cloud-based solution is looking for (hint, they aren’t looking for hundreds of features), which cloud service provider to choose, and why. Technically, cloud solutions are really Software as a Service (SaaS), so keep this in mind as we further explore and define cloud solutions.

Customer Expectations. As we discussed in our previous blog, features and functionality will not ensure you are a winner in the cloud space. Instead, they are table stakes. Without feature parity with your competition, your solution will not even be considered. Be confident in the knowledge that your competition has the same features as you. They know what your solution has, because they have an account on your system and have been using it since it launched. Yup, they’re watching every move you make. Instead of features being the main lever used to create differentiation, customers are defining a new paradigm.

In order of importance, the most common components customers say are essential for a cloud-based solution (in addition to security) are ease of use, service, support, scalability, performance and availability. More important than having hundreds of features is having usability, support and performance. The second group of components customers look for includes out-of-the-box integrations, insightful analytics, simple reporting, and lastly, a robust feature set.

Interestingly enough, people expect to use a cloud-based solution with no training, get the solution up and running in 10 minutes, and have easy access to helpful, on-demand support. OFS recommends you achieve these goals through being involved in implementation from the beginning and using tutorials, context-specific self-help systems that utilize videos, and chatbots to provide customers with self-service help on demand. For more information on how chatbots can help you provide excellent service to your customers, please see our blog series on chatbots here.

Cloud Service Providers. One of your most impactful decisions in this process is choosing which cloud service provider should host your new solution. There are many from which you can choose, and here are a few top examples:

  • Amazon Web Services (AWS) is the leader in cloud computing, with many services (including many fully managed services) and a large community. However, many AWS services tend toward vendor lock-in.
  • Microsoft Azure Cloud Services is considered the next leading cloud services provider after AWS. Azure is well suited to the Windows/.NET client base. Azure has adopted open source in big data, but every service coming from the traditional Microsoft stack (SQL Server, etc.) carries vendor lock-in.
  • Google Cloud Platform (GCP) does not yet offer as many services or as much community support as AWS or Azure, but GCP is differentiating itself by avoiding vendor lock-in, using more open-source technologies instead.

Each cloud service provider has its own platform with its own APIs, management and reporting consoles, and technology stack. This is how these providers can offer differentiated services to their customers. There are so many factors that go into selecting a cloud solution that some businesses are not afraid to pick a multi-vendor configuration.

Whether you’re interested in using just one vendor, or you think a multi-vendor configuration is right for you, here are the most important factors to consider in your decision:

  1. How many services does this provider offer, and how many of them are fully managed?
  2. What is the community support like for this platform?
  3. Will I experience vendor lock-in issues with this service provider?
  4. What availability, durability, and performance does this service provider’s SLA offer (how many “nines”)?
  5. Will I need to be aware of conflicts of interest if I choose this service provider?
  6. Is this service provider compliant with industry standards like PCI and HIPAA?
  7. What is the cost structure for this provider?

All the factors above will influence your choice, but the major advantage of cloud is the pay-as-you-go model: You don’t really commit to anything, so you can start experimenting with any or all platforms and mix and match, too. In addition, the new container-based architecture introduces a lot of flexibility.

To learn how you can effectively market and sell your cloud solution, stay tuned for our final blog in this series, “What are Cloud Solutions Anyway? Part 3,” coming next week. What are some of your observations about working with cloud solutions? Do you have any additional suggestions for what to consider when choosing a cloud service provider? Are you planning to implement a cloud solution for your business? Contact us here to talk with one of OFS’s tech experts, or leave us a comment to start the discussion!

About the Author

Abdul Rafay Mansoor is a technical architect at ObjectFrontier, Inc., and his work primarily involves presales consulting. Abdul has been a developer for more than a decade, and he began taking on presales consulting roles a few years ago. Abdul’s area of interest is cloud native development, and you often will find him passionately advocating cloud adoption to our clients.

 

Posted on Thu, Nov 2, 2017 @ 7:51 am

SaaS. Public Clouds. Private Clouds. If you’re in the software industry, chances are you hear these terms pretty much every day. From your boss to your colleagues to the software industry gurus you follow on Twitter, cloud solutions are what everyone is talking about right now. But with so much noise out there, you may be asking these questions:

“What exactly are cloud solutions?”

“How relevant are they to my industry?”

“Why should I build my next solution in the cloud?”

These are very good questions we hear from many of our clients, too. As we move into an era of solutions that include Google Docs, Office 365, Mint.com and many others, it’s important to understand what these cloud solutions are and what they aren’t, as well as how customers interact with them. In this blog series, we not only want to answer your questions, but we also want to give you ideas for how you can build successful cloud solutions that speak to your customers’ needs.

To ensure we’re on the same page as you are when you read this series, it’s a good idea to start with a clear working definition of the cloud. The cloud is nothing new. In fact, the cloud has been around since ARPAnet first linked two computers together in 1969. Up until recently, the cloud consisted mostly of web and file servers hosted on different Internet-connected networks and really didn’t have too much more to offer. It remained “the Internet” for years. However, once companies started to build and deliver applications and complete solutions on the Internet, they decided to rename the Internet as “the cloud” to breathe new life into the same infrastructure. The Internet was a network of connected computers and web/file servers, and now it is a network of connected cloud solutions and cloud service providers.

While cloud solutions are very different from each other, they typically share a few common characteristics that define them as such. First, a cloud solution is typically deployed (installed/configured) on a group of servers hosted and maintained by a third party, with instant provisioning of new customers and users. This requires scalability, which is provided by virtualized resources with the ability to expand and contract servers and compute power on the fly based on need. With this model, the customer’s main benefit is that his or her IT department typically is not involved in the installation, configuration, or maintenance of the solution. Believe it or not, since budgets continue to shrink and more IT departments are outsourced, many IT departments are advocating for cloud-based solutions because they don’t have to be involved as much over the lifecycle of the solution. Overall, the total cost of ownership of a cloud-based solution is often much less than purchasing and maintaining a software solution in house. Secondly (and this might go without saying), cloud-based solutions are almost always accessed through a web browser over the Internet using standard web ports (e.g., 80 and 443), with little to no software installed locally to make the solution work. A good cloud solution will work from any network on any computer that has a modern web browser. Lastly, most cloud-based solutions are multi-tenant to achieve economies of scale for the provider.

There are two ways you can architect a multi-tenant solution, and it is extremely important to understand the difference between them. The first way to implement a multi-tenant solution is cheaper but less secure. You would have a single database that contains all data from all customers in the same tables. Naturally, this is easier to build and cheaper to run. However, this makes a lot of customers nervous because their data is commingled with their competitors’ data in the same table. The only thing separating one customer’s data from another is a customer ID field on each record. The second way to implement a multi-tenant solution is more expensive but also much more secure: Have a unique database for each customer who uses the solution.  Naturally, with one DB per customer, there is no commingling of data. Each DB can be encrypted with its own unique encryption key, and there is almost no chance one customer can gain access to another’s data.
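A minimal sketch of the first (shared-database) approach shows why the customer-ID filter is the only thing separating tenants.  The schema and data below are invented for illustration.

```python
# Shared-database multi-tenancy in miniature: one table, all customers'
# rows commingled, isolated only by a customer_id column on each record.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer_id TEXT, item TEXT)")
db.executemany("INSERT INTO orders VALUES (?, ?)", [
    ("acme", "widget"), ("acme", "gadget"), ("globex", "sprocket"),
])

def orders_for(customer_id):
    """Every query MUST filter by tenant; forgetting the WHERE clause
    here is exactly the data-leak risk that makes customers nervous."""
    rows = db.execute(
        "SELECT item FROM orders WHERE customer_id = ?", (customer_id,)
    ).fetchall()
    return [item for (item,) in rows]

print(orders_for("acme"))  # only acme's rows; globex's never appear
```

In the second (database-per-tenant) model, `orders_for` would instead open the connection for that customer’s dedicated, separately encrypted database, so there is no shared table to mis-filter in the first place.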

Now that we have working definitions of the cloud and cloud-based solutions, let’s talk about the terms public cloud, private cloud, and SaaS. The servers and data in a public cloud are hosted on a provider’s network, intended for multiple customers to connect from any place with an Internet connection. In a private cloud, a solution and all of its data are hosted within the firewall of a single customer’s network and are only accessible by that one customer’s users.  Software-as-a-Service (SaaS) is nothing more than a deployment and delivery model (and sometimes a monetization method): users access the hosted solution on demand, as they need it, with little to no installation required. SaaS is the opposite of an on-premise solution. Knowing this, you can see how certain industries would be pulled toward cloud solutions.

If you think about why customers are entertaining cloud-based solutions, it becomes clear why most industries today are moving toward them. Costs are a driving factor for every business, and they are all going up: personnel, health care, raw materials, and IT. At the same time, customers are demanding that things be built faster and delivered more quickly (increased costs again) and at a lower price. As a result of this and the current economic climate, most companies are seeing their department capital budgets go down, which means they have less money to invest in costly on-premise solutions. With that in mind, it is no surprise that cloud-based solutions are most relevant in industries that are being forced to operate on leaner budgets, such as higher education, facilities management, healthcare, and legal. Based on a quick review of publicly available RFPs, you will find that even the federal government is moving to cloud-based solutions for some of its needs. These industries are moving to cloud-based solutions due to the tremendous cost savings as well as the other benefits they provide.

The benefits of cloud-based solutions are rather substantial for both the customer and the solution provider. Let’s look at why cloud solutions are important from both sides of this table.

The Customer. More customers are demanding services in the cloud for a number of reasons. First of all, many are becoming more cost-sensitive about large purchases and are moving from a CapEx operating model to an OpEx model, meaning they don’t want to commit to purchasing a system for $50-$100k up front. Instead, these customers desire a low monthly payment with no commitment and quick onboarding. Secondly, internal IT costs are rising. We have been hearing from our customers that internal IT department chargebacks for hosting a new application are often just as much as, or more than, the license of the application. Lastly, on the same point, IT departments themselves are asking for more cloud-based services since they are running “leaner” than they have in the past.

The Solution Provider. The number one reason you should be thinking about building your next solution in the cloud is simple: Your competition is already doing it. If you take a step back and think about it, the reasons become clear. In a world of cloud-based applications and agile development, you can provide new features and defect fixes every day if you want to, but in reality, the release schedule usually is once every two or three weeks. The point is when there are new features to be rolled out, all you have to do is update a single cluster of servers in a cloud instead of rolling out an update to each of your customers, getting it installed, and dealing with the support fallout of incorrectly applied upgrades and patches. Secondly, when all of your customers log into the same multi-tenant environment, you are able to accumulate all of their usage data in one place. Check out our blog on machine learning and data lakes to understand the benefits.

When all of your customer usage data is stored in one place, you can…

  • Understand usage patterns and see what features customers are and are not using
  • Perform A/B testing of new features/ideas in a real environment
  • Directly market and sell additional functionality (upsell) to specific sets of customers with specific usage patterns. For example, you can say, “We noticed you use feature X. Do you know that if you upgrade to the premium package, feature X is expanded in a way we think would be useful to you?”
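As a rough sketch of the first and third points, here is how usage events might be tallied once they all land in one place. The event schema and customer names are invented for illustration:

```python
from collections import Counter

# Hypothetical usage events as they might land in a multi-tenant log.
events = [
    {"customer": "acme",   "feature": "reports"},
    {"customer": "acme",   "feature": "reports"},
    {"customer": "acme",   "feature": "export"},
    {"customer": "globex", "feature": "reports"},
]

# Which features are used at all, and how often?
feature_counts = Counter(e["feature"] for e in events)

# Customers who already use "reports" are candidates for a premium upsell.
upsell_candidates = {e["customer"] for e in events if e["feature"] == "reports"}
```

A feature that never appears in `feature_counts` is a candidate for retirement; a customer in `upsell_candidates` is a candidate for the targeted pitch described above.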

Lastly, here’s what we consider the real value of a cloud-based solution: analytics. With cloud solutions and their no-commitment monthly payments comes the reality that you could lose any customer in any given month. They simply are not locked into a commitment, which is one of the main selling points for the customer in the first place. How has the software industry retained customers in the on-premise world? By implementing and releasing new features before the competition does. With cloud-based solutions, your competition can ship just as quickly as you can. So, what is the new way to provide differentiated value to your customers? Insightful analytics and reporting.

To learn how you can select the right provider to help you build your cloud solution, stay tuned for next week’s blog, “What are Cloud Solutions Anyway? Part 2.” Or, contact us here to set up a time to talk with us about your questions and interest in implementing cloud solutions.

About the Author

Abdul Rafay Mansoor is a technical architect at ObjectFrontier, Inc., and his work primarily involves presales consulting. Abdul has been a developer for more than a decade, and he began taking on presales consulting roles a few years ago. Abdul’s area of interest is cloud native development, and you often will find him passionately advocating cloud adoption to our clients.

Posted on Thu, Sep 28, 2017 @ 1:19 pm

Big Data. Data Lakes. Analytics. Machine Learning.  Are these tools only meant for tech giants like Google or Facebook? Or, can these tools meaningfully assist any business and help it respond to the ever-changing needs of the market? As the software industry continues to move toward consuming and delivering cloud-based solutions, many companies now realize they sit on a gold mine of data. Although many understand the value of data, few understand how to unlock that value.

Every time one of your customers clicks on a link, completes a transaction, or views a page, it can be logged and tracked.  Some customers realize the value of this data and set up a data lake to store it. To put it simply, a data lake is a database that contains huge amounts of raw transactional data. Data lakes often get confused with data warehouses because they are similar, but data lakes are much more versatile and can grow much larger than data warehouses. With SaaS or connected on-premise solutions, data can be exported on a regular basis from the solution to a data lake, where the data is normalized. This enables you to apply analytics at a later date. If you need help choosing the right data lake, we suggest Hortonworks Data Platform. (Full disclosure: OFS partners with Hortonworks.)

Data lakes collect endless amounts of raw transactional data, but it is just that: raw data. Without applying analytics to that data, obtaining any useful information or action items from it is very difficult.  Everyone knows what analytics is, but most only scratch the surface of what analytics can do for their business. For example, how many people look at a data lake and ask, “How many people logged in?” or “How many people clicked on a link after we deployed this new product/feature?” or “How many widgets were purchased after this marketing campaign?” It’s good to have this information when answering one-off tactical questions, but it does not tell a story about how your business is doing. It does not provide insights into your customers’ and users’ activities and trends.
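Each of those tactical questions is just a count over raw events. A minimal sketch, using an invented event schema and dates:

```python
from datetime import datetime

# Raw, timestamped events of the kind a data lake accumulates
# (the schema and dates are illustrative).
events = [
    {"user": "u1", "action": "login", "at": datetime(2017, 9, 1)},
    {"user": "u2", "action": "login", "at": datetime(2017, 9, 5)},
    {"user": "u1", "action": "click", "at": datetime(2017, 9, 6)},
    {"user": "u2", "action": "click", "at": datetime(2017, 9, 7)},
]

feature_release = datetime(2017, 9, 4)

# "How many people logged in?" -- a one-off tactical count.
logins = sum(1 for e in events if e["action"] == "login")

# "How many clicked after the new feature shipped?"
clicks_after_release = sum(
    1 for e in events
    if e["action"] == "click" and e["at"] >= feature_release
)
```

Useful numbers, but as the paragraph above notes, they answer one question each; they say nothing about where the business is heading.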

What makes things more difficult is analyzing data from disparate systems. Think about a typical e-commerce shop. It has multiple, different backend systems all connected to create a complete solution:

  • Customer-facing web portals
  • Shopping carts
  • Billing systems
  • ERP/OMS/WMS/other order and fulfillment systems
  • Shipping systems
  • Sales/contact management systems

Each of these systems comes from a different provider and has its own database, methods, naming conventions, and APIs. The data lake ingests all of the data from each system, which can then be used for various analytics.
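A common first step is to rename each system's fields into one shared schema as the data is ingested. A minimal sketch, with invented field names standing in for two of the systems above:

```python
# Two hypothetical source systems describing the same order with
# different field names (both records are invented for illustration).
billing_record  = {"cust_id": "C42", "amt_usd": 19.99}
shipping_record = {"customerNumber": "C42", "orderTotal": 19.99}

# Per-system field mappings onto a single, normalized data-lake schema.
FIELD_MAP = {
    "billing":  {"cust_id": "customer_id", "amt_usd": "total"},
    "shipping": {"customerNumber": "customer_id", "orderTotal": "total"},
}

def normalize(system, record):
    """Rename a source record's fields to the common schema on ingest."""
    return {FIELD_MAP[system][key]: value for key, value in record.items()}

normalized = [normalize("billing", billing_record),
              normalize("shipping", shipping_record)]
```

Once both records share the same field names, analytics can treat the billing and shipping systems as one data set rather than two.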

All too often, the analytics applied to a data lake do not realize the true potential of the data, or they produce information that is just plain inaccurate. How many of us have seen this scenario? Data scientists look at pre-collected data chosen arbitrarily by an engineer, and then present that data as charts or graphs on a web page. How many dashboards contain the same data: new accounts this month, number of logins, and number of widgets purchased? Typically, this type of data is used to prove or disprove a theory held by a single person or team. These solutions provide no conclusions, next steps, or trends to watch for, and most importantly, they reveal little to no insight into overall business trends.

The industry is moving in another direction and putting a new layer on top of the analytics engine: machine learning (ML). Data scientists now use artificial intelligence engines that meticulously sift through each raw transaction in a data lake to look for crucial trends that lie either at the surface or deep within the data. These engines are capable of gathering, comparing, and analyzing data from multiple, completely different sources, such as the ones mentioned above. These machine learning engines consume that data and ultimately deliver suggestions, theories, and trends that answer significant questions you didn’t even think to ask. Using artificial intelligence (AI) algorithms, an ML engine can even alter its analysis and conclusions based on the changing trends it sees within a data set.
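As a toy illustration of one thing such an engine looks for, the sketch below fits a trend line to a metric and flags a reversal. Real ML engines are vastly more sophisticated than this; the metric, numbers, and threshold logic here are all invented:

```python
# Toy stand-in for trend detection over data-lake metrics: fit a
# least-squares trend line to a weekly series and flag a reversal.
weekly_signups = [120, 132, 141, 150, 148, 139, 125]

def slope(ys):
    """Least-squares slope of ys against the indices 0..n-1."""
    n = len(ys)
    mean_x = (n - 1) / 2
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(ys))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# Compare the trend in the first half of the series to the second half.
first_half, second_half = weekly_signups[:4], weekly_signups[3:]
if slope(first_half) > 0 and slope(second_half) < 0:
    print("Trend reversal: signups were growing but are now declining")
```

The point is not the arithmetic but the shift in posture: instead of a human asking "how many signups last week?", the engine watches the series continuously and surfaces the reversal on its own.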

We all think about sales and marketing organizations applying ML to their data to ensure they pitch the right products at the right time to the right consumer. However, many industries now use machine learning and AI to solve very specific problems. The financial services industry applies ML to prevent fraud and reduce expenses related to it. Investment and stock brokerage firms also use AI to suggest stock trends and offer insights to traders about when to enter and exit certain holdings. The healthcare industry has seen an explosion of data collection from wearable devices and is using ML to provide accurate, more targeted healthcare services to specific individuals. Even the oil and gas industry is applying ML and AI to data sets collected from mineral analysis to predict refinery failures or service degradations before they happen.

Ultimately, each industry and business strives to accomplish two things: Identify profitable opportunities for growth, and reduce or avoid risk. Humans ask big data teams to display data they “feel” is important to make educated business decisions.  However, this leads to missing significant trends and insights living deep within the data, and even glaring trends staring data analysts right in the face.  By implementing machine learning on top of data lakes and existing analytics engines, you can gain insights from transactional data to help your business grow.

Interested in seeing machine learning and data analytics technologies in action? Check out ObjectFrontier’s iHealth application demo from our Analytics Innovation Lab. iHealth is powered by data analytics and machine learning to provide real-time insights that create better opportunities to treat patients experiencing abnormal heart rates during physical activity.

About the Author

Bob Kramich leads all our US sales efforts and is responsible for directing all business development activities for OFS globally. Bob has more than 25 years of experience in creating and delivering high-value software engineering relationships with US and international companies. Prior to joining OFS, Bob served as vice president of Business Development, Life Sciences, for EPAM Systems, Inc. (NYSE:EPAM), a leading global provider of software product development services. Bob joined EPAM through EPAM’s acquisition of GGA Software Services, the world’s preeminent provider of scientific informatics services to global biotech and pharmaceutical companies. Bob served as GGA’s chief business development officer. Bob holds a Bachelor of Arts degree from Tufts University and a Master of Business Administration degree from Boston College’s Carroll School of Management.

Posted on Wed, Apr 13, 2016 @ 6:46 am

I’m an old software warhorse and wrote my first program back in 1972. Everything certainly has changed since then, but some principles endure. For example, one of my early bosses used to remark that good software was like a good axe a lumberjack used for years. Yes, the head had been changed several times and the handle was replaced a few times too, but somehow, it was still the same axe. His point was that good code should be designed and separated into components, so that as one part wears out, it can be replaced without throwing away the entire code base. Even when all the components have been replaced over time, somehow the product is still the same.

That concept of separate components with well-defined interfaces is particularly relevant in today’s digital business. As we all rush headlong into the process of digital transformation, it’s important that we don’t get so wrapped up in the latest mobile, cloud, and IoT technologies that we forget the basic notion that it will all change again. (And, like a metaphorical Yoda in the software world, I’ve lived long enough to see it change many, many times!) We can prepare for that change by moving to an API model that separates our important business logic, which doesn’t change as often, from the ever-changing ways we use that logic to drive our business.

You often hear about the API economy these days, and as a veteran from the early days, I think it’s great to see us reach that nirvana of re-use we had long hoped for. You no longer need to know the details of where data is stored, how it is accessed, or all the rules pertaining to it. You just call the component with an agreed-to format (API) and you instantly get the data you need to incorporate into your own program. This has given cloud-based software a tremendous boost, allowing us to quickly build new software that stands on the shoulders of software already written and tested by someone else.

That same concept applies to our internal business software. If we can componentize our business rules and database access and create a set of well-defined API calls to handle things like adding a customer, calculating a payment, or giving out the current inventory level of a specific product, we are setting up our own building blocks that enable us to assemble them in new and important ways during our digital transformation efforts. Put another way, if we expect to fully participate in the API economy with other businesses, we must build our own APIs for ourselves first.

For example, APIs that record and monitor a car rental can be used by the rental agency’s website for booking purposes, by the renter’s expense management system to get the receipt and charges, and by the car’s manufacturer to provide usage information for warranty coverage.

While all this would certainly help customers on their journey and streamline the car rental agency’s processing, it requires a lot of effort. The agency first must simplify its backend systems by getting rid of duplicate systems of record that store rental information inherited from prior acquisitions.  However, it takes years to decommission old systems of record. In the meantime, it may be necessary to design and build lower-level APIs around each of the duplicate systems and then build higher-level APIs to hide the fact that some of the data is actually stored in different systems. Neither the renter, nor their expense management system, nor the car manufacturer cares one bit about which internal system the agency uses to record the transactions they need. Your APIs need to mask all that complexity so you can offer a single view into your company and a single way to do business. This is the essence of what APIs offer to both your external customers and your own developers.
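That higher-level masking API might be sketched like this (the two "legacy" stores and all records are invented for illustration):

```python
# Two duplicate systems of record inherited from prior acquisitions,
# each holding some of the rentals.
legacy_system_a = {"R-100": {"car": "sedan", "days": 3}}
legacy_system_b = {"R-200": {"car": "suv", "days": 5}}

def get_rental(rental_id):
    """One public call; callers never learn which backend held the record."""
    for system in (legacy_system_a, legacy_system_b):
        if rental_id in system:
            return system[rental_id]
    raise KeyError(f"unknown rental {rental_id}")
```

When a legacy system is finally decommissioned, only the lookup loop inside `get_rental` changes; the renter, the expense system, and the manufacturer keep calling the same API.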

While it might be nice to try to build APIs for all your corporate data, the reality is that even if you could do it, the time and cost required would be prohibitive. You have to build your APIs over time, embedding the work in each of the short-term projects your business requires. This takes real discipline, as it always takes longer to build the APIs first than to just bang out the code the business asked for.

Also, it always seems that your own staff, who are building new APIs, are the last ones to use them. However, by incenting them through carrots and sticks (like authorship awards for new APIs or penalties for not using them), you’ll create the ability to change the axehead and handle of your systems while keeping the essence of them the same.