"Master data! Master data! My supply chain for master data!"
With data quality and consistency becoming critically important factors in supply chain performance, companies will have to pay more attention to master data management. That may require supply chain managers to change the way they think about and utilize data.
"A horse! A horse! My kingdom for a horse!" screams King Richard III in Shakespeare's play of that name. At that point in the play, Richard, unhorsed and fighting on foot, is put at a disadvantage on the field of battle at Bosworth; as a result, he is killed by Richmond, who then succeeds to the throne as Henry VII. The point I am making here and with the title of this article is that the availability of a critical resource (like a horse, in Richard's case, or master data, for a supply chain) can be crucial for success and even for survival.
We at Gartner define master data management (MDM) as a technology-enabled business discipline in which business and information technology (IT) must work together to ensure the uniformity, accuracy, stewardship, semantic consistency, and accountability of an enterprise's official, shared master data assets. Supply chain performance is dependent on consistent definitions of customers, products, items, locations, and other master data objects. When data is poorly governed and inconsistent, supply chains become less competitive because more time and money is spent on managing information between systems and trading partners, and less is available for innovation. Good data leads to efficient supply chains, allowing resources to be spent on innovation rather than on coping with problems.
Master data has always been necessary, but the importance of its consistency in supply chains is growing. There are three main reasons for this. First, supply chain performance is coming under an increasing number of pressures. These include global and local competition; legal and regulatory demands; and shareholders focused on social responsibility, to name just a few. Underlying them all are today's fragile economic conditions.
Second, there is a growing emphasis among many organizations on knowing their customers' needs. More than this, organizations are seeking to influence the behavior of customers and prospects, guiding customers' purchasing decisions toward their own products and services and away from those of competitors. This change in focus is leading to a greater demand for and reliance on consistent data. For any supply chain leader, the path to meeting those demands leads back to master data.
And third, data consistency—especially between trading partners—is increasingly a prerequisite for improved and competitive supply chain performance, given the level of attention that IT is placing on data consistency, companies' growing focus on collaboration with trading partners, and their need to improve business outcomes.
As data quality and consistency become increasingly important factors in supply chain performance, companies that want to catch up with the innovators will have to pay closer attention to master data management. That may require supply chain managers to change the way they think about and utilize data. With that in mind, here are four topics that should be on the joint information management and supply chain agenda for 2013.
1. Business outcomes trump data quality
I am playing with words here. Of course data quality is important. But how important should it be? That is, how much cost should you incur to improve data quality, and what business value will you realize from that effort? Programs like master data management are most successful when they have a clear line of sight to a specific and measurable business outcome. By contrast, organizations seem to struggle when they focus on data quality and metrics related to the data itself as measures of success.
An example of a "bad" (ineffective) MDM metric would be "the number of de-duplicated records per month." This is of little interest to the user of business information, and it does not help the business understand why changing the way it uses information would improve outcomes. An example of a "good" (effective) MDM metric would be "net new revenue per first six weeks of new product introduction." This information will be relevant to the business user—the word "revenue" will make sure of that. Moreover, there is a specific time frame; the metric is bounded so that it can be measured. The number of de-duped records is not irrelevant, as de-duping would improve the quality of the data being used. But adopting the "net new revenue" metric will, rightly, keep the focus on the relationships between various activities and the outcomes of the work taking place, rather than on the data itself.
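To make the contrast concrete, here is a minimal sketch of how such a bounded, outcome-focused metric might be computed. The order records, field layout, and function name are hypothetical, not taken from any real system:

```python
from datetime import date, timedelta

# Hypothetical order records: (order_date, product_sku, revenue).
# The data and SKU names are illustrative only.
orders = [
    (date(2013, 3, 4), "NEW-100", 12000.0),
    (date(2013, 3, 20), "NEW-100", 8000.0),
    (date(2013, 5, 1), "NEW-100", 5000.0),   # outside the 6-week window
    (date(2013, 3, 10), "OLD-200", 9000.0),
]

def net_new_revenue(orders, sku, launch_date, window_weeks=6):
    """Sum revenue for a product within its first `window_weeks` weeks.

    A bounded, outcome-focused metric: it has a time frame and a
    business quantity (revenue), unlike a count of de-duplicated records.
    """
    cutoff = launch_date + timedelta(weeks=window_weeks)
    return sum(rev for d, s, rev in orders
               if s == sku and launch_date <= d < cutoff)

print(net_new_revenue(orders, "NEW-100", date(2013, 3, 1)))  # 20000.0
```

The point of the sketch is the shape of the metric, not the code: it is bounded in time, attributed to a specific product introduction, and expressed in a quantity the business already cares about.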
2. Information governance: Less about control and more about information value
Organizations are making progress with master data management and other information governance programs, but we are still seeing great resistance to these efforts. One reason for that resistance is that users often misunderstand what "information governance" means. Many organizations equate governance with rules, regulations, and "Big Brother" (management that exerts excessive control) limiting flexibility in how the business handles data and what it can do with it. However, a more informative interpretation of information governance recognizes that the focus should be more on identifying what data is most useful to the business and its desired business outcomes, and on designing processes that are as flexible as the business needs them to be.
When asked what the term means to them, however, business users we regularly speak with have offered many different definitions, including: security, access, control, rigidity, limited flexibility, IT managers or "Big Brother" watching, extra work, "something focused on data that IT needs to work with," "something we are doing wrong (apparently)," and "not related to what we do in the business." (The very word governance, which implies control from above, may be partly responsible for those attitudes. Replacing it with terms such as "stewardship" or "custodianship" might help to allay any fears users have in that regard.)
These responses reflect a dated, negative view of what information governance is about. Today, a much different approach is called for. No one, for instance, should design a governance process that is rigid; instead, the process and supporting organization should be as flexible as the business needs them to be. Security and access, moreover, remain relevant policy concerns for the work being done, but they should not be the only, or even the main, focus of governance; today they are secondary or tertiary concerns.
In addition, information governance should be undertaken only when a business has both the desire to govern data for the express purpose of realizing business value and the willingness to change the business processes that create, enrich, approve, or otherwise use data, so that it can extract that value. For example, business users who had previously been reluctant to participate in the governance of customer data become willing to do so when it helps them achieve their own, measured objectives.
Unfortunately, people do not always recognize the potential benefits of information governance. Consider the example of an employee like "Fred." You all know who Fred is. He has been with your supply chain group for years; he does not "like" IT and IT does not "like" him. But when it is 4:45 p.m. on a Friday and the information system will not allow you to ship an order, you go to Fred to find out how to get that order out the door. Fred knows that if you enter "00" in the field that is at fault, it will override the system and allow the order to ship! Fred is the authority and the informal steward of information. He and his like are governing information every day. But information governance today is not focused on stopping what Fred is doing. Instead, it is focused on understanding what is wrong with a process and its supporting application, and on changing them to enable better outcomes—such as shipping orders complete and on time.
As this all-too-common example suggests, we need to avoid the emotional inhibitions related to terms and concepts like "information governance" and just get the job done.
3. Information as an asset (balance sheet) and information value yield (profit and loss)
The growing hype about "big data" analysis is leading organizations to ask themselves: Is there any way we can monetize our information? Can we use information not only in our own business but actually sell some aspect of it to others?
Some companies are already doing this. A few years ago Gartner published a case study about how Best Buy was, at the time, selling access to application programming interfaces (APIs) that published product-attribute information for use in marketing aggregators' online shopping sites. This data was originally provided, in part, to Best Buy by its suppliers. Those same suppliers, meanwhile, were working on ways to monetize their own product-related data. Thus, while manufacturers and their retail partners may primarily sell products like televisions and Blu-ray players, they can also create an additional revenue stream by selling some of their information.
The Best Buy example and more-recent stories, such as how Netflix is able to mine insights from when and how frequently customers pause their video players, illustrate how information can be accounted for as an asset. This concept is beginning to attract more attention. But information is not yet considered to be an acceptable intangible asset for accounting purposes, so the monetary value of a company's unique customer master list remains unaccounted for.
If you accept, however, that your organization's information assets have financial value, then a host of questions opens up. Which information assets should you invest in most? Which information assets, and which information management or exploitation programs, will yield the greatest returns? Should you hold information assets on the assumption that they will pay a higher return later? Do you invest in enterprise resource planning (ERP) or business intelligence (BI) systems, and in which order? What about master data management? These are hard questions to answer. But they are questions your IT group must be able to answer—and it does so (perhaps informally) whenever it communicates its priorities in support of a particular business goal.
4. Making information governance "stick"
To address the issues discussed above, companies finally are starting to have more down-to-earth conversations about data governance. Many leading and next-leading organizations are appointing or hiring "data stewards" and are establishing business process and business data owners and data governance bodies. They may subsequently adopt master data management technology, perhaps coupled with a business process management (BPM) initiative. The scope of such initiatives, moreover, is often dictated by a broad and strategic focus on supply chain performance. That's what has been happening in 2012 and 2013. Why, then, are we not hearing more about successful MDM projects?
In fact, there are MDM success stories, but not every implementation is going as well as everyone would like. One scenario we have been seeing recently is what I would characterize as being unable to make information governance "stick." The situation typically looks like this:
Implementation is complete.
Applications have been integrated; data is flowing.
We hired data stewards (within the business, in fact).
There was or is a governance board; they met a few times—we think.
Now it's three months since "go live" and the project team has disbanded.
Exceptions are emerging in the data, and the business users are coming to IT for resolution. IT does not know what to do with the exceptions, and business users can't understand the language of the messages.
There are two reasons companies find themselves in this kind of situation. The first is that some organizations are struggling to get sufficient buy-in for the new roles and responsibilities required for governance (policy setting) and stewardship (policy enforcement). They initiate the necessary work as part of the implementation but do not seem to carry through with it day to day.
The second is that too many so-called master data management software vendors and their tools are not mature enough to adequately support business-led data stewardship. When I worked in industry (in consumer goods, industrial manufacturing, and white goods) in the days before information governance had been formally defined, I figured out how to use product data to do my job better. Sometimes that meant discovering what kind of data exceptions could be used to override the system. But the tools I used were rudimentary, even manual. Today's master data management solutions would not have been useful to stewards of supply chain product data like me, or to my current-day counterparts. Too much emphasis is being placed now on data quality, matching, integration, and modeling. And too little is being placed on the monitoring and problem-solving tools that business data stewards need in order to carry out their day-to-day work.
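The kind of day-to-day monitoring and problem-solving tooling described above can be pictured as a simple exception monitor that surfaces rule violations for a steward to resolve. This is only an illustrative sketch; the field names and business rules are hypothetical:

```python
# Hypothetical master-data records for items; fields are illustrative.
records = [
    {"sku": "A-100", "unit": "EA", "weight_kg": 1.2},
    {"sku": "A-101", "unit": "",   "weight_kg": 0.9},   # missing unit of measure
    {"sku": "A-102", "unit": "EA", "weight_kg": -4.0},  # impossible weight
]

def find_exceptions(records):
    """Flag master-data records that violate simple business rules,
    returning (sku, issue) pairs for a data steward to work through."""
    problems = []
    for r in records:
        if not r["unit"]:
            problems.append((r["sku"], "missing unit of measure"))
        if r["weight_kg"] <= 0:
            problems.append((r["sku"], "non-positive weight"))
    return problems

for sku, issue in find_exceptions(records):
    print(sku, issue)
```

A steward-facing tool built on checks like these spends its effort on routing problems to the person who can fix the underlying process, which is the emphasis the article argues is missing from today's quality-and-matching-centric MDM suites.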
The role of data steward, by the way, should not be an onerous one. In fact, it should not be a full-time job. If it is, then the organization is focusing on the wrong things. Problem solving for business process outcomes that are held back by data problems that the IT group cannot handle should take no more than a few minutes each week. How many minutes may differ for each organization—it might be 15 minutes, or 20, or 10. The number is not the point; the point is that this responsibility should take up a very small amount of time compared to the rest of a business user's work.
One other important point is that data maintenance is different from data stewardship. Too many users and vendors do not understand that these roles, and the work associated with them, can and should be separated. Who actually creates the data is not so important; that work could be done as a shared service, or it could even be outsourced. But the role of steward—that is, the chief problem solver—cannot be outsourced or removed from the line of business that is affected by the data problem.
Winning with data management
The four issues discussed in this article are the largest and most notable of the trends related to master data management and information governance that will play out in supply chains across the globe. There is one important point I must re-emphasize. The supply chains that will win in the next few years won't come out on top simply because they have the best information. All of them, I believe, will do something more with their data: They will successfully tie their information management disciplines to specific and measurable business outcomes.
As trading partners continue to deepen their collaborative relationships, seek to better understand their customers and end consumers, and focus on ever more demand-driven supply chain strategies, the consistency of the data that resides within corporate systems and is shared with partners will become even more critical than it is now. Businesses will need to govern their information to a degree that will ensure the integrity of their supply chain strategies—and master data management is where this is taking shape.
A hefty 42% of procurement leaders say the biggest threat to their future success is supply disruptions—such as natural disasters and transportation issues—a Gartner survey shows.
The survey, conducted from June through July 2024 among 258 sourcing and procurement leaders, was designed to help chief procurement officers (CPOs) understand and prioritize the most significant risks that could impede procurement operations, and what actions can be taken to manage them effectively.
"CPOs' concerns about supply disruptions reflect the often unpredictable nature and potentially existential impacts of these events," Andrea Greenwald, Senior Director Analyst in Gartner's Supply Chain practice, said in a release. "They are coming to understand that the reactive measures they have employed to manage risks over the past four years will not be sufficient for the next four."
After supply disruptions at No. 1, the survey showed, the second-biggest perceived threat to procurement is macroeconomic factors, such as economic downturns and inflation. While more predictable, those variables can substantially influence long-term procurement strategies.
The third-most serious perceived risks were geopolitical issues, including tariffs and regulatory changes, and compliance issues, including regulatory and contractual risks.
The survey also revealed that "leading organizations" are 2.2 times more likely to view energy availability and cost as a top risk, indicating a focus on emerging risks: as electrification drives demand for power, brittle grid infrastructure raises concern about whether the energy supply can keep pace.
The market for environmentally friendly logistics services is expected to grow by nearly 8% annually through 2033, reaching a value of $2.8 billion, according to research from Custom Market Insights (CMI) released earlier this year.
The “green logistics services market” encompasses environmentally sustainable logistics practices aimed at reducing carbon emissions, minimizing waste, and improving energy efficiency throughout the supply chain, according to CMI. The market involves the use of eco-friendly transportation methods—such as electric and hybrid vehicles—as well as renewable energy-powered warehouses, and advanced technologies such as the Internet of Things (IoT) and artificial intelligence (AI) for optimizing logistics operations.
“Key components include transportation, warehousing, freight management, and supply chain solutions designed to meet regulatory standards and consumer demand for sustainability,” according to the report. “The market is driven by corporate social responsibility, technological advancements, and the increasing emphasis on achieving carbon neutrality in logistics operations.”
Major industry players include DHL Supply Chain, UPS, FedEx Corp., CEVA Logistics, XPO Logistics, Inc., and others focused on developing more sustainable logistics operations, according to the report.
The research measures the current market value of green logistics services at $1.4 billion, which is projected to rise at a compound annual growth rate (CAGR) of 7.8% through 2033.
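As a quick sanity check on those figures, compounding the current market value at the reported CAGR does land near the projected total (assuming a 2024 base year and a nine-year horizon to 2033):

```python
def cagr_projection(present_value, cagr, years):
    """Compound a present value at a constant annual growth rate."""
    return present_value * (1 + cagr) ** years

# $1.4 billion compounded at 7.8% per year over 9 years (2024 -> 2033)
projected = cagr_projection(1.4, 0.078, 9)
print(round(projected, 2))  # 2.75, i.e. roughly the $2.8 billion cited
```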
The report highlights six underlying factors driving growth:
Regulatory Compliance: Governments worldwide are enforcing stricter environmental regulations, compelling companies to adopt green logistics practices to reduce carbon emissions and meet legal requirements.
Technological Advancements: Innovations in technology, such as IoT, AI, and blockchain, enhance the efficiency and sustainability of logistics operations. These technologies enable better tracking, optimization, and reduced energy consumption.
Consumer Demand for Sustainability: Increasing consumer awareness and preference for eco-friendly products drive companies to implement green logistics to align with market expectations and enhance their brand image.
Corporate Social Responsibility (CSR): Companies are prioritizing sustainability in their CSR strategies, leading to investments in green logistics solutions to reduce environmental impact and fulfill stakeholder expectations.
Expansion into Emerging Markets: There is significant potential for growth in emerging markets where the adoption of green logistics practices is still developing. Companies can capitalize on this by introducing sustainable solutions and technologies.
Development of Renewable Energy Solutions: Investing in renewable energy sources, such as solar-powered warehouses and electric vehicle fleets, presents an opportunity for companies to reduce operational costs and enhance sustainability, driving further market growth.
Peter Weill of MIT tells the audience at the IFS Unleashed user conference about the benefits of being a "real-time business."
These "real-time businesses," according to Weill, use trusted, real-time data to enable people and systems to make real-time decisions. By adopting that strategy, these companies gain three major capabilities:
Increased business agility without needing a change management program to implement it;
Seamless digital customer journeys via self-service, automated, or assisted multiproduct, multichannel experiences; and
Thoughtful employee experiences enabled by technology-empowered teams.
The benefits of this real-time focus are significant, according to Weill. In a study with Insight Partners, he found that those companies that were best-in-class at implementing automated processes and real-time decision-making had more than 50% higher revenue growth and net margins than their peers.
Nor is adopting a real-time data stance restricted to just digital or tech-native businesses. Rather, Weill said that it can produce successful results for any companies that can apply the approach better than their immediate competitors.
Weill's remarks came today during a session titled “Becoming a Real-Time Business: Unlocking the Transformative Power of Digital, Data, and AI” at the “IFS Unleashed” show in Orlando, Florida.
Millions of residents and workers in the Tampa region have now left their homes and jobs, heeding increasingly dire evacuation warnings from state officials. They’re fleeing the estimated 10 to 20 feet of storm surge that is forecast to swamp the area, due to Hurricane Milton’s status as the strongest hurricane in the Gulf since Rita in 2005, the fifth-strongest Atlantic hurricane based on pressure, and the sixth-strongest Atlantic hurricane based on its peak winds, according to market data provider Industrial Info Resources.
Between that mass migration and the storm’s effect on buildings and infrastructure, supply chain impacts could hit the energy logistics and agriculture sectors particularly hard, according to a report from Everstream Analytics.
The Tampa Bay metro area is the most vulnerable, with the potential for storm surge to halt port operations, road and rail transport, air travel, and business operations—possibly for an extended period of time. In contrast to those “severe to potentially catastrophic” effects, key supply chain hubs outside the core zone of impact—including the Miami metro area along with Jacksonville, FL and Savannah, GA—could see more moderate impacts, such as slowdowns in port operations and air cargo, Everstream Analytics’ Chief Meteorologist Jon Davis said in a report.
Although it was recently downgraded from a Category 5 to a Category 4 storm, Milton is anticipated to cause major disruptions to transportation, in large part because it will strike an “already fragile supply chain environment” that is still reeling from the fury of Hurricane Helene less than two weeks ago and from the ILA port strike that crippled ports along the East and Gulf Coasts before ending just five days ago, a report from Project44 said.
The storm will also affect supply chain operations at sea, since approximately 74 container vessels are located near the storm and may experience delays as they await safe entry into major ports. Vessels already at the ports may face delays departing as they wait for storm conditions to clear, Project44 said.
On land, Florida will also likely face impacts in the last-mile delivery industry as roads become difficult to navigate and workers evacuate for safety.
Likewise, freight rail networks are also shifting engines, cars, and shipments out of the path of the storm as the industry continues “adapting to a world shaped by climate change,” the Association of American Railroads (AAR) said. Before floods arrive, railroads may relocate locomotives, elevate track infrastructure, and remove sensitive electronic equipment such as sensors, signals and switches. However, forceful water can move a bridge from its support beams or destabilize it by unearthing the supporting soil, so in certain conditions, railroads may park rail cars full of heavy materials — like rocks and ballast — on a bridge before a flood to weigh it down, AAR said.
Imports at the nation’s major container ports should continue at elevated levels this month despite the strike, the National Retail Federation (NRF) and Hackett Associates said in their Global Port Tracker report.
To be sure, the strike wasn’t without impacts. NRF found that retailers who brought in cargo early or shifted delivery to the West Coast face added warehousing and transportation costs. But the overall effect of the three-day work stoppage on national economic trends will be fairly muted.
“It was a huge relief for retailers, their customers and the nation’s economy that the strike was short lived,” NRF Vice President for Supply Chain and Customs Policy Jonathan Gold said in a release. “It will take the affected ports a couple of weeks to recover, but we can rest assured that all ports across the country will be working hard to meet demand, and no impact on the holiday shopping season is expected.”
Looking at next steps, NRF said the focus now is on bringing the International Longshoremen’s Association (ILA)—the union representing some 45,000 workers—and the United States Maritime Alliance Ltd. (USMX) back to the bargaining table. “The priority now is for both parties to negotiate in good faith and reach a long-term contract before the short-term extension ends in mid-January. We don’t want to face a disruption like this all over again,” Gold said.
By the numbers, the report forecasts that U.S. ports covered by Global Port Tracker will handle 2.12 million twenty-foot equivalent units (TEU) for October, which would be an increase of 3.1% year over year. That is slightly higher than the 2.08 million TEU forecast for October a month ago, and the strike did not appear to affect national totals.
In comparison, the August number was 2.34 million TEU, up 19.3% year over year. September is forecast at 2.29 million TEU, up 12.9% year over year; November at 1.91 million TEU, up 0.9%; and December at 1.88 million TEU, up 0.2%. For the year, that would bring 2024 to 24.9 million TEU, up 12.1% from 2023. The import numbers come as NRF is forecasting that 2024 retail sales – excluding automobile dealers, gasoline stations, and restaurants to focus on core retail – will grow between 2.5% and 3.5% over 2023.
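The year-over-year percentages above can be used to back out the implied prior-year volumes, a useful cross-check on the forecasts. A small sketch (the function name is illustrative):

```python
def implied_prior_year(current_teu, yoy_growth):
    """Back out the prior-year volume implied by a year-over-year growth rate."""
    return current_teu / (1 + yoy_growth)

# October 2024 forecast: 2.12 million TEU, up 3.1% year over year,
# implying roughly 2.06 million TEU handled in October 2023.
print(round(implied_prior_year(2.12, 0.031), 2))  # 2.06
```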
Global Port Tracker, which is produced for NRF by Hackett Associates, provides historical data and forecasts for the U.S. ports of Los Angeles/Long Beach, Oakland, Seattle and Tacoma on the West Coast; New York/New Jersey, Port of Virginia, Charleston, Savannah, Port Everglades, Miami and Jacksonville on the East Coast, and Houston on the Gulf Coast.