Competition is a fact of life in retail, something retailers of every size face daily. Major US companies like Kroger, Home Depot and Walmart, and European players like Aldi Einkauf, Tesco and Carrefour, use every tool at their disposal to maintain a competitive advantage. That could mean optimizing the efficiency of their supply chain to reduce stock-out and overstock situations, getting a better understanding of who the customer is to maximize wallet share, or predicting demand months ahead to take advantage of bulk buying opportunities.
For many retail organizations looking to keep that advantage, Artificial Intelligence (AI) and Machine Learning (ML) have become the leading edge. Examples like tracking and predicting the spread of the flu virus, as Walgreens does, or recommending makeup based on computer-analyzed models of your face, as Sephora does, are just some of the ways major retailers use advanced technology to stay ahead. These use cases might sound futuristic, but they are already a real part of how AI and ML influence organizations and customers.
The common thread across all these use cases is the crucial need for data to make them a success. Much has been said about the transformative power of data and its role in the fourth industrial revolution, yet 98 percent of those surveyed believe that preparing and aggregating large datasets in a timely fashion is a major challenge. Anecdotally, it has been said that "using bad data for AI and ML projects only helps organizations make poor decisions with more confidence." In short, when used successfully, data plays the lead role: it is a catalyst for innovation and a differentiator for the organizations that tap into it.
But tapping into the power of data needs to be done deliberately. There are complex technology considerations, skills gaps that require hiring or upskilling, often a lack of the required infrastructure and, most importantly, the time to make it all work together (many AI and ML projects take as long as six months to complete). Unfortunately, these challenges put the use of data to differentiate and innovate out of reach for all but the most mature organizations.
At TimeXtender, we love solving complex customer problems, so in close cooperation with the Microsoft AI Engineering team we came up with a new approach. In short, the team of analysts and engineers from TimeXtender and Microsoft decided to put the customer at the center and to simplify the use of advanced technology through automation, enabling just about anyone to tap into the transformative power of data.
The result of this collaboration is a technology template (available in the Azure Marketplace), a retail sales data model built into the template, and an end-to-end forecasting model deployed through Azure Automated Machine Learning (AutoML).
While this might sound like a complex mix of standalone components, the beauty of the solution lies in its simplicity. In as little as two hours, you can deploy a full enterprise-grade technology and data architecture, populate a data model and receive detailed forecasts using cutting-edge technology.
It takes three simple steps:

1. Through a single Azure Marketplace template, deploy all the services the solution requires: Discovery Hub, Azure Data Lake Storage Gen2, Azure SQL Database, an Azure Machine Learning workspace, Azure Analysis Services and a Power BI dashboard. These services are preconfigured, connected and deployed in sequence, eliminating any manual setup work.

2. Once deployed, use Discovery Hub to connect to your data sources and map fields to the pre-defined data structure used by the Microsoft machine learning service.

3. Execute the Discovery Hub project to load, prepare and cleanse the data and store it in an Azure SQL database. Then execute the pre-built Jupyter notebook (hosted on an Azure Notebook VM) to connect to the data, split it into training and test sets, and run AutoML to perform the required feature engineering, algorithm selection, parameter tuning, model recommendation and, ultimately, forecasting. The forecast is then written back to the Azure SQL database, where it is made available via a Power BI dashboard.
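The field mapping in the second step happens inside Discovery Hub's interface, but its effect can be pictured as a simple column rename. Here is a minimal sketch in plain Python; the source column names (`InvoiceDate`, `Qty`, `StoreID`) and the target names are illustrative assumptions, not the solution's actual schema:

```python
# Hypothetical mapping from a retailer's source columns to the
# pre-defined data structure expected by the forecasting model.
FIELD_MAP = {"InvoiceDate": "date", "Qty": "units_sold", "StoreID": "store"}

# One incoming row, as it might arrive from a point-of-sale system.
row = {"InvoiceDate": "2019-01-01", "Qty": 4, "StoreID": "S01"}

# Rename each source field to its target name.
mapped = {target: row[source] for source, target in FIELD_MAP.items()}
print(mapped)  # {'date': '2019-01-01', 'units_sold': 4, 'store': 'S01'}
```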
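To make the third step concrete, the sketch below shows, in plain Python, the chronological train/test split that forecasting requires, together with a seasonal naive baseline standing in for the model AutoML would select. The column names ("date", "units_sold"), the synthetic weekly sales pattern and the baseline itself are all illustrative assumptions; in the actual solution, AutoML handles feature engineering, algorithm selection and tuning automatically.

```python
from datetime import date, timedelta

# Hypothetical daily sales history, as it might be read from Azure SQL.
# Units sold follow a simple repeating weekly pattern for illustration.
history = [
    {"date": date(2019, 1, 1) + timedelta(days=i),
     "units_sold": 100 + (i % 7) * 10}
    for i in range(28)
]

# Chronological split: hold out the final week for testing, since a
# forecasting model must be validated on data it has not yet "seen".
split = len(history) - 7
train, test = history[:split], history[split:]

def seasonal_naive_forecast(train_rows, horizon, season=7):
    """Forecast each future day as the value one season (week) earlier."""
    last_season = [r["units_sold"] for r in train_rows[-season:]]
    return [last_season[i % season] for i in range(horizon)]

forecast = seasonal_naive_forecast(train, horizon=len(test))

# Mean absolute error of the baseline on the held-out week.
mae = sum(abs(f - r["units_sold"]) for f, r in zip(forecast, test)) / len(test)
print(forecast, mae)
```

Because the synthetic data repeats perfectly every week, the baseline scores an error of zero here; real sales data would not, which is exactly why AutoML's model search earns its keep.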
This simple three-step process deploys all the advanced technology needed, along with Discovery Hub to manage the data, plus the data model and dashboards to review and track the results. The environment can easily be extended to provide further insights, and the AutoML process can be tuned to deliver more accurate results. Learn more about the combined solution in this Microsoft Tech Community blog post.
In retail, tapping into the power of your data can dramatically alter the success of the organization, but for most companies it remains a pipe dream. With the innovative pairing of Discovery Hub and Azure AutoML, that data can instead be put to work accelerating growth and success.