Supplier-Experience.com is a magazine focusing on manufacturing supply chains

Data visibility and data quality control

Most of the recent discussions have been about clearing the current stack, defining the core data strategy, and building a data flow strategy. This is the right direction, and it is also the only way to build a data control strategy and its policies. In the early 2010s the market was not as mature as it is today, and vendors offered unique features that led customers to platform acquisitions and short-term choices.

Some of those companies are now fixing those issues in their digital transformation and cloud transformation projects. Because technology develops so fast, companies have had to either make decisions quickly or wait, while competitors gained market advantages with new technologies.

Looking back on the decisions made 10 years ago

What would have changed if some of those technology decisions had been postponed for a decade? The business side and delivery would probably have been affected significantly. Still, even if the technology wasn't perfect, it kept running. People found the problems and have been fixing them ever since. And even after spending a decade on those fixes, businesses are deciding to move forward with new solutions.

How to bring the legacy and learnings to the new tech world?

In the big picture, the biggest challenge for organizations has been documenting what they have learned. Replicating software is easy; replicating a decade of lessons learned alongside the solutions is worth more than money. The individuals who hold this knowledge are the most valuable asset in future development projects.
