This extract from The AI-Powered Enterprise: Harness the Power of Ontologies to Make Your Business Smarter, Faster, and More Profitable
by Seth Earley is © 2020 and reproduced with permission from Lifetree Media.
Enterprises are like organisms in an economic ecosystem. The principles that enable a healthy biological ecosystem are, from a physical, chemical, and informational perspective, identical to those that enable a healthy business ecosystem and ensure the survival of its members. Value is created by solving problems through the application of information and creativity. By speeding information flows and reducing inefficiencies, we equip our part of that larger ecosystem to operate effectively, adapt quickly, and evolve to meet competitive threats and exploit opportunities in the environment.
Supply chains are a crucial and complex part of the information flowing in this ecosystem. They are intricately structured, variable, and highly sensitive systems, in which even minor changes in initial conditions or components can produce very different outcomes. They comprise a large collection of interacting components that are difficult to understand or examine because of how they are designed and operated. And they are systems in process, changing and developing over time.
It’s critical to think holistically about this information ecosystem as you prepare the digital representation of various stages of product design and development. Even a product designed in isolation from other systems and groups—whether in a specialized department or in a separate contracting organization—is still part of an information ecosystem. Information that may be inconsequential to the group that is creating the product, such as an obscure material specification that has no immediate value, will likely have value either downstream (perhaps to a distributor or engineering group) or upstream (perhaps to a procurement manager or supply chain manager).
Too often, these unseen dependencies and information relationships are neglected, and the impact of this neglect can be significant. If a piece of data that will be needed when assembling or distributing a future product is not captured, is lost, or is incorrectly represented, the cost of remediation is orders of magnitude larger than that of addressing the data need at the source.
Of course, it is difficult to know what will be important in the future without mapping out the information supply chain. Today’s manufacturers and product designers do not simply design and manufacture physical goods. They design and manufacture data streams and data specifications that are as important as the goods themselves. But this requirement is not always well considered at the time of design. A marketer may need a piece of data that resides in engineering. Getting that data after design teams have moved on or personnel have shifted priorities is difficult and costly.
It is not feasible to capture every piece of data that could potentially be useful for an unknown downstream purpose. Instead, you need to map the data flows that correspond with the physical and manufacturing flow and collaborate with downstream consumers of the data to understand and anticipate needs. Then capture and manage that data, along with its provenance, in the right structure and application and in compliance with data quality standards.
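To make this concrete, here is a minimal sketch, in Python, of what capturing a data point together with its provenance at the source might look like. The field names, the list of source systems, and the quality gate are illustrative assumptions, not a prescribed format.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ProvenanceRecord:
    """One captured data point, tagged with enough context for downstream consumers."""
    item_id: str        # identifier of the part, batch, or assembly
    attribute: str      # e.g., "material_spec" or "surface_finish"
    value: str
    source_system: str  # where the data originated (PLM, MES, ERP, ...)
    owner: str          # accountable group or role
    captured_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def meets_quality_standard(record: ProvenanceRecord) -> bool:
    """Hypothetical minimum quality gate: no empty fields and a known source system."""
    required = [record.item_id, record.attribute, record.value,
                record.source_system, record.owner]
    return all(required) and record.source_system in {"PLM", "MES", "ERP"}

# An "obscure" engineering attribute captured at the source so that a downstream
# procurement or distribution team can find it later.
rec = ProvenanceRecord("PN-10442", "material_spec", "6061-T6 aluminum",
                       "PLM", "design-engineering")
print(meets_quality_standard(rec))  # True
```

The point of the sketch is that the attribute is tagged with its owner and origin at the moment of capture, so a downstream consumer does not have to reconstruct that context after the design team has moved on.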
Design, manufacturing, and marketing groups need to be aware of downstream processes. Each department and group must understand how the “data exhaust” produced by their processes is going to inform both upstream and downstream systems. Your data exhaust is someone else’s data fuel.
For example, in life sciences research, antibodies are manufactured through certain processes, and the data associated with those processes is critical to end users. But even more important are the ways that fellow researchers use a particular antibody in experiments that have been written up in peer-reviewed journals. How do other researchers use the associated reagents? How well did they perform under particular protocols? What were the upstream manufacturing processes? What are the downstream applications? Where did they fail to perform?
For your enterprise, there are similar questions. How do your processes fit in with the larger business objectives, marketing strategy, customer education, and organizational processes? What information is important to customers, competitors, and suppliers? What are their roles in the information ecosystem? Mapping out and understanding these dependencies is critical to optimizing information flows beyond the immediate needs of the process at hand. Understanding and planning for these needs will help your organization differentiate based on a deeper understanding of the data. This is how your organization turns hidden data flows into a competitive advantage.
Distribution of physical goods includes distribution of data
Once products are manufactured, they need to be distributed from the point of manufacture to the point of usage. For traditional retailers, goods are moved from manufacturer to a distribution center or warehouse and then to a retail store or directly to the consumer. For business-to-business manufacturers, the supply chain can be immensely complicated, with distributors and routes to market through other manufacturers, who in turn create their products using components sourced through other manufacturers and distributors within a highly complex web of relationships.
Large brands can have tens of thousands to hundreds of thousands of suppliers. Consider the manufacturer of a complex machine such as an aircraft. An Airbus A380 contains 4 million parts made by 1,500 companies. The global supply chain is a complex system with many variables and influencing factors. Durable manufactured products can have decades-long lifespans, and companies need to stock replacement parts—or be prepared to manufacture them—throughout the product’s usable life. This also means they need to manage the associated data throughout the product’s life.
While technologies, manufacturing techniques, and products themselves have become more sophisticated, the demand for variation and customization has increased the cost and difficulty of managing a diverse supplier base. Competitive pressures have shortened product cycle times and accelerated fulfillment logistics while reducing inventory levels to save carrying costs. There is an enormous flow of goods and items that are highly esoteric and specific to an industry or a process, yet that flow carries metadata and identifiers for everything in it. Every object in your house requires a chain of manufacturers who in turn depend on other manufacturers to provide the tooling, parts, and materials to create their products.
Every one of these components has a metadata lifecycle that has flowed through the processes: from concept, through design and acquisition of raw materials, through manufacturing, and across multiple distribution and logistics channels. Every physical item has an associated information lifecycle that tracks how and where it originated, where it was distributed, and how it made it to the point where it is put to use.
Efficiencies in the physical movement of goods require efficiencies in the associated data flows. Tighter coordination of supplier logistics requires better integration of the data between suppliers. Organizations that want to improve the efficiency of their supply chains need to improve the efficiency of their information exchange. But this also requires greater transparency and trust with trading partners. Many large organizations deal with new vendors on a weekly basis. According to one source, a food manufacturer dealt with 1,000 vendors for a single line of lasagna. Combine this level of complexity and volume with a lack of transparency into upstream suppliers, and problems with safety, quality, and ethical sourcing become inevitable, generating public relations disasters that can destroy brand trust and significantly impact the future of an organization.
In supply chains, artificial intelligence (AI) can help locate interchangeable parts or substitute components, materials, alternate formulations, or ingredients. It can gather and consolidate supplier data from multiple diverse sources to ensure a holistic understanding of suppliers’ practices. It can also analyze contracts, past purchases, quality trends, and service-level agreements that would otherwise require costly, difficult-to-scale human analysis.
The key to effective AI is to have an ontology[1] that defines the correct data elements for vendor and supplier qualifications, services, terms and conditions, and historical performance. Without a single source of supplier truth, this type of trend analysis is not feasible.
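As an illustration only, the sketch below shows the kind of standardized supplier data elements and controlled vocabularies such an ontology might define so that records from different systems can be compared on equal terms. All names, categories, and ranges are hypothetical.

```python
from dataclasses import dataclass

# Illustrative controlled vocabularies; a real ontology would manage these centrally.
QUALIFICATIONS = {"ISO9001", "AS9100", "FAIR_LABOR_AUDITED"}
SERVICE_CATEGORIES = {"raw_materials", "components", "logistics", "contract_manufacturing"}

@dataclass
class SupplierProfile:
    supplier_id: str
    name: str
    qualifications: set[str]
    services: set[str]
    on_time_delivery_rate: float  # historical performance, 0.0 to 1.0
    defect_rate: float            # defects per unit shipped

def validate(profile: SupplierProfile) -> list[str]:
    """Flag values that fall outside the shared vocabularies or sensible ranges."""
    issues = []
    issues += [f"unknown qualification: {q}" for q in profile.qualifications - QUALIFICATIONS]
    issues += [f"unknown service category: {s}" for s in profile.services - SERVICE_CATEGORIES]
    if not 0.0 <= profile.on_time_delivery_rate <= 1.0:
        issues.append("on_time_delivery_rate out of range")
    return issues

acme = SupplierProfile("SUP-001", "Acme Alloys", {"ISO9001"}, {"raw_materials"}, 0.97, 0.002)
print(validate(acme))  # [] means the record conforms to the shared vocabulary
```

Once every supplier record passes through the same vocabulary and validation, trend analysis across thousands of vendors becomes a query rather than a manual reconciliation project.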
In fact, many organizations with complex trading partner networks are building standards that enable transparency and traceability throughout the entire supply chain. One startup called EVRYTHNG aims to create a digital identity for every single one of the 4 trillion consumer products manufactured every year, based on international standards. EVRYTHNG identities will enable tracking of not just every type of product but every individual item. The platform creates what the company calls an “active digital identity” on the web that can be accessed through a QR code or near field communication tag on the item. This digital identity includes metadata describing the object and its whole journey from creation through distribution, eventual sale, and in some cases recycling—indicating where it is as well as who has interacted with it.
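EVRYTHNG’s actual data model is not reproduced here; purely to illustrate the idea of an item-level identity, a hypothetical record might pair a product identifier with a per-item serial number and an event journey, along these lines:

```python
from dataclasses import dataclass, field

@dataclass
class IdentityEvent:
    timestamp: str   # ISO 8601
    event_type: str  # e.g., "manufactured", "shipped", "sold", "recycled"
    location: str
    actor: str       # who interacted with the item

@dataclass
class DigitalIdentity:
    gtin: str                                # GS1 identifier for the product type
    serial: str                              # distinguishes the individual item
    metadata: dict = field(default_factory=dict)
    journey: list[IdentityEvent] = field(default_factory=list)

item = DigitalIdentity(
    gtin="4006381333931", serial="A1B2C3",
    metadata={"category": "apparel", "material": "organic cotton"},
)
item.journey.append(IdentityEvent("2020-03-01T08:00:00Z", "manufactured", "Plant 7", "contract manufacturer"))
item.journey.append(IdentityEvent("2020-03-15T14:30:00Z", "shipped", "DC-East", "3PL carrier"))
print(len(item.journey), "events recorded for item", item.serial)
```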
One immediate application of this technology is the tracking of products in supply chains. The resulting tracking data not only provides insight into where counterfeit products are being made, it also allows companies with thousands of suppliers to see which suppliers are productive and which products are excelling. Ralph Lauren is using this technology to track clothing and consumer goods; a large seafood company called Mowi uses it to track fish products from the fish farm to the supermarket or restaurant.
Consumers will also be able to interact with their own products, see where they came from, and access digital services linked to the products. As Niall Murphy, CEO of EVRYTHNG, explains, “Bringing large-scale data science to manufacturing and supply chain traceability is transformative.” And given this mass of real-world data, applying artificial intelligence is how companies will see the patterns and use them to improve efficiency and gain insights.
Supply chain data is at the core of transparency. The question is how to identify and prioritize the correct data elements for monitoring and management. This is done by assessing risks according to product category and severity of impact, and those classifications are controlled vocabularies that are managed in the ontology.
Procurement organizations need to understand and monitor the critical data elements and to include not just pricing, delivery logistics, quality measures, and specifications but also data standards as part of their service-level agreements with suppliers. Since procurement understands the broader landscape of suppliers, it is incumbent upon that department to enforce data standards that will be leveraged by multiple downstream users and processes. If data is not in the correct format, is missing elements, or is of poor quality, then penalties need to be assessed just as they would be for any other product deficiency. Educating procurement on the downstream impact of data issues is critical to the optimization of the supply chain.
If you are a manufacturer that works through a network of distributors, having your data supply chain aligned with the needs of downstream consumers is even more important. You are ingesting data from your suppliers and enriching it with additional merchandising or application data elements. You may be destroying data as you combine components into finished products, but you may also need to maintain traceability and provenance in the event of manufacturer defects and part recalls. Having traceability in the supply chain and understanding which elements can be lost or transformed is critical to demand prediction and management, source replenishment, and understanding how customers buy and use your components or assemblies of components.
Ontologies enable standards
Ontologies are standards. They contain common terminology as well as data elements that allow for consistency in information structures across applications. That allows information to flow more smoothly and without manual translations, mapping, or manipulation when viewing or consolidating information from different systems.
GS1 is one well-known standards organization that works across sectors. GS1 includes multiple types of identifiers such as the ubiquitous bar codes that allow retail scanners to work, plus standards for locations, assets, documents, shipments, coupons, components, and parts. It also includes standards for information exchanges, including transactions, electronic data interchange, and product master data. (The digital identity that EVRYTHNG creates is based in part on the GS1 standard.)
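To give a concrete flavor of what such standards encode: a GS1 GTIN carries a check digit computed with a fixed alternating-weight rule, so any party in the chain can validate a scanned identifier without consulting the issuer. A short sketch of that calculation:

```python
def gtin_check_digit(data_digits: str) -> int:
    """GS1 check digit: weight the digits 3, 1, 3, 1, ... starting from the
    rightmost data digit, sum them, and take the amount needed to reach a
    multiple of ten."""
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(data_digits)))
    return (10 - total % 10) % 10

# GTIN-13 example: data digits 400638133393 yield check digit 1,
# completing the identifier 4006381333931.
print(gtin_check_digit("400638133393"))  # 1
```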
Standards allow for the fast movement of items through supply chains and for organizations’ quick and efficient tracking of inventory and transactions. These externally facing and public standards are critical to the efficient interchange of information, and they reduce the costs of transactions and data aggregation. But that does not mean that everything is made public. Internal standards can improve efficiencies even when, because they embody proprietary trade secrets, they are not shared with other parties. The organization will need to build internal standards that apply to processes, procedures, manufacturing techniques, formulas, and other differentiators.
Embracing a common language and shared mental model, including appropriate standards, is part of the culture and character of the organization. A cookie-cutter, off-the-shelf standard, or one appropriated from another organization, will not fit the work style and personality of teams that have formed deep and productive working relationships. Groups should own the detailed areas of knowledge and how that knowledge is organized while also following corporate practices and well-accepted approaches to building out effective taxonomies and ontologies. This becomes a collaborative exercise that increases employees’ awareness of interdependencies and of the role and value of other parts of the organization.
B2B distribution transformation requires discipline
In distribution, any time you’re moving things around, you have to predict demand. In a complex system like this, variations in seemingly unrelated areas of the physical, political, and human world can have outsize impacts on supplies and market demand. Variations in weather patterns, trade policy, and the availability of esoteric ingredients or minor components can make it hard to know how much of anything you will need at a given time in a given location.
Machine learning and AI applications can make sense of resource management inputs and parameters and can help to identify anomalies. This contributes to determining where to allocate resources and spare-parts inventories, or how to hedge risks in critical supply elements. By anticipating and correlating seemingly unrelated factors to map replacement and substitute parts and ingredients, you can mitigate disruptions. Success here depends on historical data, human judgment, and an ontology that contains product, component, assembly, and other relationships that inform AI programs.
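As a toy example of the anomaly-detection idea, the sketch below flags weeks in which demand for a component deviates sharply from its own history; real systems would use far richer models plus the product and substitution relationships held in the ontology. The data and threshold are invented.

```python
from statistics import mean, stdev

def flag_demand_anomalies(weekly_demand: list[float], threshold: float = 2.0) -> list[int]:
    """Return the indices of weeks whose demand lies more than `threshold`
    standard deviations from the historical mean, a crude anomaly signal."""
    mu, sigma = mean(weekly_demand), stdev(weekly_demand)
    if sigma == 0:
        return []
    return [i for i, d in enumerate(weekly_demand) if abs(d - mu) / sigma > threshold]

# Hypothetical demand history for a minor component; week 6 spikes unexpectedly.
history = [120, 118, 125, 122, 119, 121, 410, 123]
print(flag_demand_anomalies(history))  # [6]
```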
Smart objects
As more physical goods become sensor-enabled, even dumb, standardized, manufactured commodities can be imbued with differentiated value. How much value depends on how that data is leveraged. Consider the types of questions that smart objects in the supply chain can answer:
- What features are customers using?
- How many units of the product are being used in the marketplace?
- How is the product performing—what are the effects of wear, stresses, unusual or extreme conditions, failure rates, and efficiencies?
- Where is the product in the downstream channel? (At the warehouse, at the distribution center, on the manufacturer’s factory floor, in the finished good, at the dealer, in transit, at the final destination?)
- Into what products is the component being assembled?
- What application is it being used for?
- How is it performing in a system of other components?
Based on these data points, it will be possible to offer new services for smart devices. For example, you could guarantee performance based on field data, maximize uptime for devices using the component, optimize systems of components based on conditions, refine functionality based on user feedback, or enable control of devices by remote operators.
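For instance, a field-data-backed performance guarantee ultimately rests on simple fleet metrics computed from device telemetry. A minimal sketch, with invented field names and numbers:

```python
from dataclasses import dataclass

@dataclass
class FieldReading:
    unit_serial: str
    hours_in_service: float
    failed: bool

def fleet_failure_rate(readings: list[FieldReading]) -> float:
    """Failures per 1,000 service hours across the installed base, the kind of
    field metric a performance guarantee could be priced against."""
    hours = sum(r.hours_in_service for r in readings)
    failures = sum(r.failed for r in readings)
    return 1000.0 * failures / hours if hours else 0.0

fleet = [FieldReading("U1", 4200, False), FieldReading("U2", 3900, True), FieldReading("U3", 4100, False)]
print(round(fleet_failure_rate(fleet), 3))  # failures per 1,000 hours of service
```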
Preventative maintenance based on failure predictors
It is possible to create new business services and add new value propositions based on data from devices. For example, a manufacturer might not normally have a field service group but could use performance monitoring to offer a service that prevents unplanned outages. The device informs the home office that it is beginning to show wear patterns through various vibration and sound signatures. Rather than wait for a failure, the manufacturer or vendor replaces or upgrades the part, perhaps on a subscription basis.
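A toy sketch of the matching step described above: compare the vibration signature a device reports against its healthy baseline and flag it for proactive service when the drift exceeds a threshold. The signature format, values, and threshold are assumptions for illustration.

```python
import math

def signature_drift(current: list[float], baseline: list[float]) -> float:
    """Euclidean distance between a device's current vibration spectrum and its
    healthy baseline (both as equal-length lists of band amplitudes)."""
    return math.dist(current, baseline)

def needs_proactive_service(current: list[float], baseline: list[float],
                            wear_threshold: float = 0.5) -> bool:
    return signature_drift(current, baseline) > wear_threshold

baseline = [0.10, 0.40, 0.20, 0.05]  # amplitudes recorded when the part was new
reported = [0.12, 0.70, 0.65, 0.07]  # latest reading phoned home by the device
print(needs_proactive_service(reported, baseline))  # True: schedule the replacement before failure
```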
For B2B manufacturers and distributors to offer these services, they need to understand their products’ functionality at a data level and understand how their customers plan on using their products.
Smart spaces
Buildings are another place where the physical intersects with the virtual. We can optimize buildings for human interaction and collaboration, commerce, process efficiency, safety, and operating costs. We can get the best of all of these worlds with instrumented buildings equipped with sensors and mechanisms to track physical traffic.
Retailers are analyzing human behavior and understanding how people move through a store and find what they need. This is done through video monitoring or through opt-in beacon technology that provides an incentive (coupons, discounts, or other loyalty awards) for shoppers, allowing the retailer to track their preferences. Purchase behaviors can be correlated with in-store traffic patterns and influencers. Retailers can assist wayfinding through public spaces and create interactive applications that lead individuals to exactly the shelf and product that they need.
Virtual reality and the internet of things
Engineers are beginning to design advanced, connected features into products, including virtual reality integrated with maintenance, bots that provide instruction and answer questions, internet-connected sensors, monitoring, diagnosis, prediction, control, optimization, and autonomy. These features make a big difference.
For example, my home has a generator, and since it is a decade old, the company needs to send a technician out to check on its operation and service it. The newest models do not have that antiquated requirement. They simply call home with their operating parameters and tell the supplier when they need attention. This is what manufacturing executives and managers need to prepare for: not only the integration of self-diagnostic and reporting capabilities but also the necessity of managing the deluge of data from their devices.
Ontologies and content componentization are especially important when developing content that feeds applications such as virtual reality instructional materials. It is now possible to overlay design specifications on the physical part that needs to be replaced, repaired, or adjusted. That requires a content model, or content architecture, tagged with terminology and identifiers that match the product to the appropriate design guides and training materials. This means that an ontology has to contain the right values for parts and instructions. Machine vision systems must be able to visually identify the correct component in physical environments that are sometimes extremely complex and difficult to access.
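A minimal sketch of that lookup: once the vision system has identified a part, its identifier resolves to the design guide, repair procedure, and training content to overlay. The identifiers and structure below are hypothetical; in practice these relationships would live in the ontology rather than in a hard-coded table.

```python
# Hypothetical mapping from a recognized part identifier to its service content.
PART_CONTENT = {
    "PN-10442": {
        "design_guide": "DG-10442-rev3",
        "repair_procedure": "RP-10442-bearing-swap",
        "training_module": "TM-rotating-assemblies-101",
    },
}

def content_for_detected_part(part_number: str) -> dict:
    """Given the part the vision system identified, return the content to overlay."""
    return PART_CONTENT.get(part_number, {})

print(content_for_detected_part("PN-10442")["repair_procedure"])  # RP-10442-bearing-swap
```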
Intelligence can also be embedded in machinery so that, for example, a sensor-enabled device could report back operating parameters and signatures for vibration, sound, and heat that could then be matched with reference data to indicate that the machinery is in need of maintenance or replacement. The machinery could even have the intelligence locally available to assist the technician in making repairs, based on documentation updated with the latest techniques, diagnostic software, and calibration from its remote connection to the factory.
When enough products have these features, entire industries can be transformed. For example, AI is even enabling autonomous operation of huge mining sites. These operations use systems of equipment that come from different manufacturers but that work as a coordinated set of machines to monitor shared operations, reduce human exposure to dangerous conditions, and reduce operating costs. The entire mining lifecycle leverages analytics, machine intelligence, and autonomous equipment to optimize operations and reduce human labor.
The insights that come from instrumenting and tracking physical objects enable companies to monitor and improve their strategies in real time.
Takeaways
The relationships between the physical world and the digital world will transform how companies operate, at scales ranging from the molecular to the massive operations of mining industries. AI can help optimize and make sense of supply chain dynamics, differentiate commodity products, and use sensor data in a variety of ways that improve efficiency. With the right ontology and data structures behind these initiatives, everything businesses do is trackable, and is therefore subject to improvement through AI techniques. This is the future of manufacturing, supply chains, and physical spaces, where everything is digitally enhanced. These are the main points in this chapter:
- If you do not have the product metadata, your products will become invisible in distribution.
- Appropriate data can create efficiencies and transparency in supply chains, but that requires adherence to standards and cooperation among manufacturers and suppliers.
- Connected, instrumented components can enable new functions, such as predicting failure in the field.
Note:
1. In the first chapter of the book, Earley defines “ontology” as a representation of what matters within the company and makes it unique, including products and services, solutions and processes, organizational structures, protocols, customer characteristics, manufacturing methods, knowledge, content, and data of all types.