In ever-changing business environments, systems and databases must evolve to meet increasing demands. Business growth inevitably adds database complexity and often necessitates infrastructure improvements. New infrastructure requires data to be migrated and reentered, leaving room for clerical mistakes. As a result, records become inconsistent and redundant across the various departmental databases, and organizations cannot provide the highest level of service to their customers. Master data management (MDM) is a process that creates a master-level dataset of information, allowing interdivisional communication without the need for costly integration of division-level datasets.
Our client hired us to improve their database accuracy and efficiency. We started by creating reporting copies of each dataset. The next step was to create "golden records": master customer identities that let the division-level databases identify unique customers despite the many duplicate identities stored across them.
To create the golden records from the various customer identities siloed within the different divisional databases, we extracted each of the ten divisional databases onto the reporting platform, then cleaned and profiled them. Next, we matched the various records using customized rules, fuzzy lookups, and machine learning procedures.
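The fuzzy-matching step can be illustrated with a minimal sketch. This is not the client's actual rule set; it uses Python's standard-library `difflib` for string similarity, and the field names, weights, and threshold are illustrative assumptions.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two normalized strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_records(record_a: dict, record_b: dict, threshold: float = 0.85) -> bool:
    """Decide whether two divisional customer records likely refer to the
    same customer. Weights and threshold are illustrative; a production
    matcher would combine many more rules and fields."""
    weights = {"name": 0.5, "email": 0.3, "city": 0.2}
    score = sum(w * similarity(record_a[f], record_b[f]) for f, w in weights.items())
    return score >= threshold

# Two records of the same customer from different divisional databases
a = {"name": "Jon Smith", "email": "jsmith@example.com", "city": "Boston"}
b = {"name": "John Smith", "email": "jsmith@example.com", "city": "Boston"}
print(match_records(a, b))  # the typo in the name does not prevent a match
```

Records that clear the threshold are merged under one golden-record ID; borderline pairs can be routed to manual review or, as in this engagement, to trained matching models.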
Mastering data from siloed databases allowed our client to clean the raw data, apply business rules to validate the data, and better track customer experiences across business divisions without integrating division-level datasets. The improved data management led to improved customer service. Using our master data management system, marketing and sales can now better address customer needs. Customers' purchases are accurately tracked, and they receive relevant promotions and offers.
If you are not part of an IT company, odds are your content management system (CMS) relies at least partially on third-party applications. Third-party applications are not necessarily a bad thing—they provide many beneficial CMS services. But as the number of third-party applications increases, integration costs rise, and systems become fragmented. Companies must strike a careful balance between increasing capabilities (and costs) and maintaining a well-integrated system. We help our clients navigate the tradeoffs associated with incorporating new CMS functionality.
To centralize licensee store resources, our client asked us to build a licensee employee portal. Ideally, the client wanted to push updates and information simultaneously to the main employee portal and the licensee employee portal. This let our client house all the information that licensee store employees needed, and at the time, that was all our client wanted.
Both portals naturally underwent incremental change as a result of their slightly different roles, but our client still needed the portals to perform the same functions. Because our client was using two entirely different portals, updates meant the same code had to be written twice. To increase the efficiency of the implementation, we generalized the logic that ran both portals.
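The generalization can be sketched as shared logic parameterized by portal-specific configuration, so an update is written once and pushed to both portals. This is a hypothetical Python illustration of the pattern, not the client's SharePoint implementation; the type and function names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class PortalConfig:
    """Portal-specific settings; field names are illustrative."""
    name: str
    announcement_list: str

def publish_update(configs: list[PortalConfig], title: str, body: str) -> list[str]:
    """Push one update to every configured portal.

    The loop body is a stand-in: a real version would call each
    portal's API. Returns one log line per portal for inspection.
    """
    results = []
    for cfg in configs:
        # Same generalized logic runs for every portal; only the config differs.
        results.append(f"[{cfg.name}] posted '{title}' to {cfg.announcement_list}")
    return results

portals = [
    PortalConfig("MainEmployeePortal", "Announcements"),
    PortalConfig("LicenseePortal", "Announcements"),
]
for line in publish_update(portals, "Holiday hours", "Stores close early on Dec 24."):
    print(line)
```

With this shape, a new feature is implemented once in the shared function, and portal differences live entirely in configuration rather than duplicated code.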
Working in a production environment that relied extensively on third-party applications presented many unique challenges. Due to the size of our client's SharePoint farms, for example, we could not load the entire production environment in lower testing environments. Even though we used proxies to reproduce errors, we could not be certain that bug fixes would be successful until they were implemented in the production environment. The challenge was exacerbated by our client's scale: the company is responsible for over 10,000 stores, so any proposed change to the SharePoint farms had to go through a vetting process of at least three weeks.
Our client is a business strategist in charge of his company's brick-and-mortar and online stores. His company wanted to coordinate promotional campaigns with near real-time sales performance during the busy Thanksgiving shopping season. We had worked with this client for many years, and this latest project offered both of us a real chance to excel. In 2016, we had helped him automate retail reporting, which allowed reports to be delivered every fifteen minutes. Now, our client was tasked with automating the reporting of all sales data, a collection roughly 100 times larger than the reports we had automated the previous year. With just two months until Black Friday, we hurriedly set to work.
From the outset, we knew this project presented unique challenges. Each retail division had a different reporting system, and the brick-and-mortar stores relied on custom on-premises databases. Due to the sheer volume of the data we needed to collect, we knew that we could not process and report data at this scale using the on-premises solutions. We would need to use a suite of Azure technologies.
These technologies, however, came with their own set of problems. At the time, Azure Analysis Services was not yet out of preview, and many of the features we needed lacked documentation. With these unknowns in mind, we conducted a series of proofs of concept over a period of three weeks. We then vetted our assumptions, communicated the results to our client, and determined our final architecture design.
Our work resulted in a tremendous triumph for our client. Despite the technical challenges we faced during the Thanksgiving rush, end users saw no impact. Each of the 125 email reports was sent to promotional staff on time. Our client made five promotional decisions based on our reporting, and each decision improved revenue. Our reports were widely used: the dashboard that collated the sales reporting data ranked 100th out of 75,000 dashboards used by the organization.