Peter Bilello, Author at Engineering.com https://www.engineering.com/author/peter-bilello/

The impact of digital transformation and how to manage disruptions https://www.engineering.com/the-impact-of-digital-transformation-and-how-to-manage-disruptions/ Thu, 12 Dec 2024 18:52:10 +0000 A report on PLM Road Map & PDT Europe 2024 event in Gothenburg

The post The impact of digital transformation and how to manage disruptions appeared first on Engineering.com.

Every well-run technical conference advances in content over its predecessors, as demonstrated by the CIMdata PLM Road Map / Eurostep presentations recently given in Gothenburg, Sweden. Fading away were prior years’ discussions of the ins and outs of product lifecycle management (PLM) and solution providers.

Instead, speakers focused on how digital transformations impact their organizations and how the disruptions are managed. At prior conferences, these approaches usually got little attention.

Taken together, the presentations addressed the learning curves of digital transformations in power generation, pharmaceuticals, nuclear reactors, and wholesale grocery distribution. And they showed the breadth and depth of digital transformation’s penetration into the global economy.

The Gothenburg conference was titled “Value Drivers for Digitalization of the Product Lifecycle: Insights for the PLM Professional.” As an introduction, a high-level overview of the conference showed that digitalization boosts revenues and growth and extends manufacturing capabilities while tightening control over information for use and re-use over decades; two AI consortiums were also described.

One presentation that resonated with many attendees was from Mr. Peter Vind, enterprise architect, Siemens Energy, who focused on Business Capability Models (BCMs) as a way to prepare for changes (“both big and small, good or bad”) and to enhance the Siemens Power business model, Power as a Service (PaaS). Mr. Vind explained that BCMs define what the company needs to do to achieve its business strategy goals. The BCM serves as a bridge between business needs and IT, structures capabilities into logical clusters, and lays the groundwork for enterprise transformation.

Built on elements of PLM’s digital threads and aspects of digital models, Siemens Energy’s BCMs manage the lifecycle from inception through engineering, design, manufacture, service, and disposal, Mr. Vind said. This forms an information backbone that provides consistent, accurate, and up-to-date information and manages design and engineering changes.
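The capability-to-cluster structure Mr. Vind described can be sketched in code. The clusters, capability names, and system mappings below are invented for illustration, not Siemens Energy's actual model; the point is only that a BCM groups capabilities into logical clusters and bridges each business need to the IT systems that enable it.

```python
# A minimal, hypothetical Business Capability Model sketch.
# All names are illustrative assumptions, not the real BCM.
from dataclasses import dataclass, field

@dataclass
class Capability:
    name: str
    business_goal: str                      # what the company needs to achieve
    enabling_systems: list = field(default_factory=list)  # bridge to IT

@dataclass
class CapabilityCluster:
    name: str
    capabilities: list = field(default_factory=list)

# Hypothetical clusters spanning the lifecycle from inception to disposal
bcm = [
    CapabilityCluster("Engineering & Design", [
        Capability("Requirements Management",
                   "traceable product requirements", ["PLM"]),
        Capability("Change Management",
                   "consistent, up-to-date design data", ["PLM", "ERP"]),
    ]),
    CapabilityCluster("Service & Disposal", [
        Capability("Asset Monitoring",
                   "Power-as-a-Service delivery", ["IoT platform"]),
    ]),
]

def systems_for(goal_keyword: str) -> set:
    """Find which IT systems enable capabilities matching a business goal."""
    return {s for cluster in bcm for cap in cluster.capabilities
            if goal_keyword in cap.business_goal
            for s in cap.enabling_systems}
```

A query like `systems_for("design")` then answers the bridging question the BCM exists for: which systems must change when a given business need changes.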

Mr. Anders Romare, chief digital and information officer at Denmark-based Novo Nordisk, focused on how this pharmaceutical giant uses digital transformation to shorten the drug discovery process by several years. He stressed that Novo Nordisk is focused on reaching more patients, manufacturing capacity, and engines of sustainable growth.

Elements include:

LabDroid robotics, which uses model-driven advanced analytics and connected technology to make research projects smarter and faster.

DataCore, an “ecosystem of source systems,” a new shared-data foundation that is scalable, secure, and supported by data governance.

NovoScribe to automatically generate structured clinical-development documents: it is already 70% faster than previous methods.

Quantum computing, the focus of a US$200 million Novo Nordisk Foundation investment, to leverage Gefion, Denmark’s first AI supercomputer, with a community of users.

The opportunities and challenges of AI and automation—the Digital Core—were spelled out by Mr. Gary Langridge, engineering manager / digital thread in the Ocado Technology unit of the UK-based Ocado Group, an online retail and wholesale grocer and technology provider.

Ocado is shaping the future of e-commerce and logistics, but Mr. Langridge said there are many challenges to ensuring that customers get safe, fresh foods delivered on time. Ocado sees this as a great value compared with conventional grocery operations.

He said over 10,000 “fulfillment” robots are used in Ocado’s Hives (distribution centers), each of which can pick a 50-item order in under five minutes. He added that Ocado Hives can handle hundreds of orders simultaneously. Ocado’s robots can handle a diversity of packages, including fragile items, a variety of weights, shapes, and sizes, and they pack densely and quickly, he noted.

The system currently handles 50% of Ocado’s range of goods (out of about 100,000 SKUs) and will reach 70% by 2026, Mr. Langridge added. An AI-based “air traffic control” system with proprietary wireless technology orchestrates the Hives; Ocado’s robots are not autonomous.
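The figures Mr. Langridge quoted invite a back-of-the-envelope throughput estimate. The five-minute order time and 50-item order size come from the talk; the concurrency figure below (300 parallel order slots per Hive) is an assumption for illustration only, standing in for "hundreds of orders simultaneously."

```python
# Rough throughput sketch from the talk's figures; concurrency is assumed.
def orders_per_hour(minutes_per_order: float, concurrent_orders: int) -> float:
    """Orders completed per hour if every slot cycles continuously."""
    return (60.0 / minutes_per_order) * concurrent_orders

rate = orders_per_hour(minutes_per_order=5, concurrent_orders=300)
items_per_hour = rate * 50   # 50 items per order, per the talk
```

Under these assumptions a single Hive would complete 3,600 orders (180,000 picked items) per hour, which suggests why an "air traffic control" orchestration layer, rather than robot autonomy, is the binding design constraint.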

Ocado’s Digital Core deployed PLM tightly integrated with organizational change management, system governance, and foundational processes. “The digital thread handles the horizontal integration of data,” Mr. Langridge said, “but we’re still working on vertical integration” and “finding the right partners for cultural fits.”

CIMdata’s Aerospace & Defense PLM Action Group (AD PAG) shared its research-based insights on Model-Based Systems Engineering (MBSE) and its reality and promise. Presenters were Mr. James Roche, practice director of the CIMdata-administered PLM advocacy group, and Mr. Sandeepak Natu, co-director of CIMdata’s simulation-driven systems development (SDSD) practice.

They noted that MBSE is gaining traction in A&D due to increasing product complexity, the growing role of software in products, and the U.S. Department of Defense’s digital engineering strategy; hence, investment in MBSE is expected to continue growing.

The presentation pointed out that MBSE is found mainly in conceptual design and development, such as requirements definition and allocation, system architecture definition, and design verification and validation. It is expected to expand into production, utilization, and support with the growth of technologies like IoT/IIoT and digital twins. MBSE is also expected to move into software design, product line engineering, and design for safety and security.

In A&D, MBSE is still in the early adoption phase, and maturity levels vary significantly. Successful MBSE adoption requires a well-defined vision and strategy, strong executive commitment, appropriate methodologies, the development of MBSE expertise, education and training initiatives, middle management support, and robust tool integration and standardization. The biggest challenges of MBSE implementation are its organizational impact and cultural resistance.

The AD PAG survey found a crucial need for enhanced interoperability between tools and platforms, open application programming interfaces (APIs), and adherence to standards. Respondents also said they wanted enhanced user interfaces, more capable change management, and stronger tools for change impact analysis. The survey also revealed that MBSE investment justifications are shifting from immediate returns toward long-term strategic value considerations.

The push for high-speed innovation is delayed by speed bumps such as changes in requirements, supply chain bottlenecks, and ever-increasing product complexity. Dr. Uyiosa Abusomwan, senior global technology manager for digital engineering and conference keynoter for Day Two, outlined how Eaton Corporation navigates these market dynamics with a novel approach to product development.

Dr. Abusomwan outlined Eaton’s “Digital Engineering Deployments” within its infrastructure of connected engineering tools.

This infrastructure relies heavily on:

Model-Based Engineering (MBE) to analyze product performance, such as deformation, critical stresses, fatigue, and thermal impacts, while taking into consideration design rules and insights into costs, manufacturing, and sustainability.

Intelligent Automated Design that integrates AI and rule-based automation to execute the complete product design process, including material selection, design calculations, analyses, cost and lead-time simulation using aPriori, and AI-enabled optimization of parameterized models, reduced-order modeling, or generative design.
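The automated pipeline described above can be sketched schematically: rule-based steps chained in sequence, each enriching the design state. Every function name, design rule, and cost figure below is invented for illustration; Eaton's actual tooling (including aPriori's cost models) is not reproduced here.

```python
# A toy chained design pipeline; rules and numbers are illustrative only.
def select_material(design):
    # toy rule: aluminum unless the thermal environment demands steel
    design["material"] = "aluminum" if design["max_temp_c"] < 200 else "steel"
    return design

def size_part(design):
    # toy design rule: wall thickness scales with load, with a minimum gauge
    design["thickness_mm"] = max(2.0, design["load_kn"] * 0.5)
    return design

def estimate_cost(design):
    # stand-in cost model: material density as a crude cost proxy
    density = {"aluminum": 2.7, "steel": 7.8}[design["material"]]
    design["cost"] = design["thickness_mm"] * density
    return design

def run_pipeline(design):
    for step in (select_material, size_part, estimate_cost):
        design = step(design)
    return design

result = run_pipeline({"max_temp_c": 150, "load_kn": 10})
```

In a real deployment each step would be a solver, simulation, or AI-driven optimizer rather than a one-line rule, but the chaining pattern, where each stage consumes and extends a shared design state, is the essence of "executing the complete product design process."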

Furthermore, Dr. Abusomwan presented the impacts of Digital Engineering on three Eaton product lines:

• A 4-fold performance gain and 80% weight reduction in intelligent high-efficiency, light-weight heat exchangers.

• Using AI and multi-physics parametric optimization, an 87% reduction in the automated design time for lighting fixtures.

• A 65% reduction in autonomous digital design time for high-speed gears used in electric vehicles.

The challenges of operating enterprise-scale PLM implementations were addressed by Mr. Jorgen Dahl, senior director of PLM at GE Aerospace. He covered ensuring that the existing system works to its full capability and is stable and usable by all who need it. And this is amid continual modernization to accommodate nonstop business transformation, Mr. Dahl said, delivering new digital thread capabilities across the enterprise and accommodating innovations such as virtualized infrastructure automation.

Mr. Dahl urged his peers to set “uncomfortably ambitious” goals of 75% to 99.9% improvement. “It’s not enough to define the strategy and share it,” he observed. The strategy “has to be shared early and often because the minute after you are done sharing it, understanding of it may start to deteriorate and get reinterpreted.” At the heart of his presentation were four things to do and corresponding “do-nots”:

• Imagine the ideal end state and do NOT set goals based on current constraints.

• Ask “what must be true” and do NOT proceed without clarity.

• Create a multi-phased development strategy and do NOT use return on investment (ROI) calculations to drive short-term goals; as a substitute for a holistic strategy, an ROI focus could make things worse, not better.

• Every new capability must lead to the end goal rather than developing short-term improvements with no clear path forward.

“This may appear like common knowledge to many,” Mr. Dahl cautioned, “but over 30 years of observation suggests it is not.”

Dr. Rob Bodington, a Eurostep technical fellow speaking about ShareAspace, outlined how defense contractors can be sure of the contents of their Digital Twins and Digital Threads. To frame the use of commercial frameworks for intellectual property over today’s long-lived and highly complex weapons platforms and systems, he used the acronym SMART.

Dr. Bodington defined SMART as unambiguous Semantics in requesting information, Measurable and enforceable data key performance indicators (KPIs) in contracts, Accuracy and precision in specifying what is asked for, Reasoned requests for only what is necessary, and being Timely about when data is needed. Added to these requests, he noted, is compliance with security regulations, maintaining trust, respecting commercial constraints, and developing sound ontologies.
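Dr. Bodington's criteria lend themselves to a simple gap check against a draft data request. The acronym and criteria are his; the field names, request fields, and checklist representation below are invented for illustration.

```python
# A toy SMART gap-checker; field names are illustrative assumptions.
SMART_CRITERIA = {
    "semantics":  "unambiguous semantics in requesting information",
    "measurable": "measurable, enforceable data KPIs in the contract",
    "accurate":   "accuracy and precision in specifying what is asked for",
    "reasoned":   "request only what is necessary",
    "timely":     "state when the data is needed",
}

def smart_gaps(request: dict) -> list:
    """Return the SMART criteria a draft data request fails to address."""
    return [c for c in SMART_CRITERIA if not request.get(c)]

request = {
    "semantics":  "terms drawn from an agreed ontology",
    "measurable": "KPI: 99% of deliveries validate against the schema",
    "accurate":   "STEP part structure, revision C",
    "reasoned":   "",   # over-broad: the request is not yet justified
    "timely":     "within 30 days of design freeze",
}
```

Here `smart_gaps(request)` flags only the unjustified "reasoned" field, the kind of over-broad ask Dr. Bodington warned against.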

There are “way too many” standards, he said, with many overlaps. And while standards reflect the collective experience of an industry, he pointed out, they sometimes address “problems you didn’t know you had.”

AI “will help,” Dr. Bodington continued, by generating and exploiting ontologies while classifying data and exploiting its contexts. AI can also generate and clarify terminology in contracts.  

Dr. Erik Herzog, Technical Fellow at Saab Aeronautics, provided an update on the Heliple-2 project to create federated PLM capabilities that are interoperable and a feasible alternative to monolithic PLM systems. Heliple-2, he explained, uses the Open Services for Lifecycle Collaboration (OSLC) standard, its related Genesis architecture, and the STEP standards.

Heliple-2 addresses a core information challenge in the 50-year lifecycles of weapons systems—their development systems are commonly replaced three times, Dr. Herzog pointed out.

Working on the Heliple-2 project are Eurostep, Saab, Volvo, IBM, LynxWork (a startup using OSLC for integration and traceability in engineering software), Sweden’s innovation agency Vinnova, and KTH, Sweden’s Royal Institute of Technology and its largest technical university.

Using LynxWork and OSLC, plug-and-play integration can be implemented in a week, he said. The next steps for the program are industry-scale validation, backward navigation for links under configuration management, and demonstrating analysis capabilities spanning multiple applications, Dr. Herzog concluded.

Dr. Cristina Paniagua, a Luleå University of Technology researcher, addressed shortcomings in commonly used tools and solutions. She spelled out how the Swedish-Finnish Arrowhead flexible Production Value Network (fPVN) initiative is integrating traditional standards with emerging technologies in PLM, ERP, data management, and interoperability. This initiative has goals similar to Heliple-2.

Continuous evolution and integration are essential, she concluded.

Image: A conceptual view of the Arrowhead fPVN project as discussed by Dr. Cristina Paniagua. Note the centrality of CIMdata’s lifecycle optimization hexagon.  Image courtesy of TacIT

Interestingly, most of the presenters in Gothenburg delved into how their organizations are improving the management of product data and information. Many zeroed in on using PLM in digital transformation and how PLM is increasingly recognized as essential to overcoming data complexities and frustrations.

In my opening remarks, I reiterated that successful digital transformation requires a holistic, end-to-end approach to connectivity, strategies, and tactics, one that must address the organization’s issues with its people, processes, and the technologies it uses. I also stressed the criticality of organizational change management in maximizing value delivery. And I covered recent developments in PLM itself, such as AI, that enhance the value that can be realized from investment in digitalization; these developments, together with evolving customer demands and new market opportunities, all motivate investing in digitalization of the product lifecycle.

Final thoughts on PLM value

As PLM professionals, we must continually seek to enhance the value resulting from the digitalization of the product lifecycle. This requires keeping an eye on the evolving trends and enablers of digital transformation, which are comprehensively laid out in CIMdata’s Critical Dozen.

In Gothenburg, it was repeatedly emphasized that maximum value can only be obtained from a holistic, end-to-end approach and that organizational change management plays a critical role in maximizing the adoption of digitalization and delivering the expected value. Ultimately, investment in the digitalization of the product lifecycle is motivated by evolving customer demands and newly uncovered market opportunities.

The seven premises of successful digital transformation https://www.engineering.com/the-seven-premises-of-successful-digital-transformation/ Fri, 27 Sep 2024 19:10:06 +0000 The high failure rate for digital transformation projects is scary. Here’s one way to improve your chances of success.

The post The seven premises of successful digital transformation appeared first on Engineering.com.


For many companies, Digital Transformation (DT) has become a nonstarter—no surprise given its negative track record. According to studies conducted by McKinsey & Co., Boston Consulting Group, KPMG, Bain & Co., and business-media publisher Forbes, the risk of failure in DT projects is between 70% and 95%.

So why keep trying? Isn’t digitalization enough? The answer is “no”, but it’s a good start. Digitalization converts analog data and information into 1s and 0s so they can be accessed and read by any digital toolset or application; the goal is to make all of an enterprise’s information available to anyone in the workforce who needs it and is approved to use it.

Ultimately, DT completes the process of digitalization by digitally connecting the enterprise throughout its products’ lifecycles so that it can continuously transform itself. DT is about enabling dramatically improved processes, new business models and new value-added products and services to give you a competitive advantage. DT’s goal is to create growing value everywhere in the organization with more competitive products and services, speedier production and deployment and more effective service—all while fostering collaboration and innovation.

DT accommodates and fosters continuous change and radically new digital work environments, placing tough new demands on the workforce while helping them adapt and succeed. DT leverages the organization’s digitalized knowledge base to levels previously out of reach.

Sounds wonderful, right? So, how do we get started? How do we successfully digitally transform when so many have failed in the past?

Here, I offer seven DT Success Premises. These premises represent keys to successful DT and were formed during CIMdata’s four decades of untangling data and information.

We begin with two blunt statements.

The first Success Premise states:

Digital must be at the core of your company. To be successful, digital (i.e., the availability of valid digital information and process enablement) must cut across all the enterprise’s departments and even include development partners, suppliers and customers, as well as the end-to-end product lifecycle. As a result, we—those of us who deal with DT every day—must continue to understand how PLM and other digital enabling strategies and tools are evolving and stay ahead of them.

To this, we must add a second Success Premise, likewise the fruit of four decades of focused work with data and information, which reads:

Digital strategies need to be built on a solid foundation of business justification, as well as a set of strategy elements that have been designed to evolve with the business. As a result, we must know and promote how PLM is a major foundation of your company’s DT and other business-critical elements.

We will explain five more Success Premises after we look into why DT faces so much resistance, much of it rooted in outdated information-handling practices. The resulting workforce problems include having to:

• Jot down critical information from their workstations and operations on paper forms or sticky notes.

• Protect information in obsolete and cumbersome formats—on paper or digital.

• Work with organizational information they know is incomplete and thus not fully trustworthy.

Little wonder, then, that factory floor workforces will be among DT’s biggest beneficiaries. This brings us to CIMdata’s third Success Premise:

Implementing PLM and other digital enabling solutions is “like performing open-heart surgery on a person while they run in a marathon.”

Corollary: “You must strive to keep the complex simple.”

Because DT seems to take forever, the fourth Success Premise reads:

Digital is not something you implement overnight; as a result, what you define today may not be appropriate tomorrow. Therefore, flexibility, configurability and sustainability are critical. DT enablement with PLM is that and more, and it must be communicated early and often.

The fifth Success Premise addresses endless change:

DT requires a company’s PLM strategy and associated roadmap and support to be robust and flexible … Rome wasn’t built in a day … DT is a journey.

The sixth Success Premise:

The evolving nature of the typical enterprise means that digital strategies should be defined and implemented in a sustainable manner that naturally addresses change.

Corollary: “Change happens; you might as well embrace it.”

The use of the word “enable” in the fourth Success Premise should not be overlooked. To be effective, DT must focus on workforce enablement. In other words, the finalization of DT is not implementation, as if something is done by uploading blocks of code into databases, but an enablement—as is virtually anything accomplished with PLM. The new way of working and the new processes enabled by new technologies must not just be implemented; they must be embedded in the organization’s culture. This isn’t a one-time action; it requires an ongoing state of continuous enablement and improvement.

The seventh and final Success Premise addresses terminology:

Don’t be afraid to call PLM something else. PLM by any other name is still PLM, but you may need to stop running into the brick wall. Either remove the wall or go over or around it.

You might be asking yourself, why would I, as the CEO of the leading PLM strategic management consulting company, say something like that? Because as noted in the first Success Premise, DT must reach all the enterprise’s departments and development partners, suppliers and customers throughout the entire product lifecycle. Many essential enterprise units are skeptical about PLM, believing it’s needed only for product development and the engineering department. They mistakenly fixate on the “P” in the PLM abbreviation rather than on the “L.” From an overall organizational or enterprise standpoint, “product” implies a focus limited in ways that “lifecycle” is not.

Corollary: “You can only run into a brick wall so many times before you break your collarbone.”

To sum up the seven Success Premises, we must always bear in mind that as a digital strategy is built, it must be communicated early, often and firmly to the entire workforce. Most new technology can be imposed—implemented top-down. But, with or without PLM, digital enablement is best done bottom-up.

Rationale

These seven Success Premises show why expertise and experience are indispensable, even crucial, in any transformational change.

The surging importance of expertise and experience recognizes, perhaps belatedly, that digital is endless in its reach and depth. This means finding it, getting access to it and transforming it is a bigger challenge than expected at the outset.

Until now, too much DT discussion has focused on questions such as:

• What makes each solution provider’s offerings superior to those of its competitors?

• How satisfied will everyone be once enablement is complete, however “complete” is defined?

• What is likely to go wrong?

As we better understand our DT challenges, changing information-handling practices becomes ever more important—especially on the factory floor and out in the field. The following conclusion crystallizes this.

Conclusion: critical success factors

Finally, I’d like to share the insights CIMdata has gained from participating in hundreds of digital transformation (DT) and PLM enablement projects. These seven key points of critical advice are as follows:

• Use a broad vision and approach: people want business solutions, not another system.

• Educate senior management and the initial team.

• Support and do not undermine company culture.

• Seek partners: people who understand your business needs and who have proven solutions & track records.

• Scope should be well-defined, clearly understood and under change control; required functionality must be precisely specified at each stage.

• Use pilot projects as the key to success.

• Seek to continually learn and adjust as required.

Ultimately, DT success is a multi-variable equation that is the sum of the enterprise’s DT vision, the organization put in place to support it, the processes included, the solution providers chosen to support it, the approach taken and the technology and process environment.

Where and how PLM fits into a digital transformation initiative https://www.engineering.com/where-and-how-plm-fits-into-a-digital-transformation-initiative/ Mon, 19 Aug 2024 15:20:22 +0000 Is there a path towards a truly integrated and collaborative manufacturing environment?

The post Where and how PLM fits into a digital transformation initiative appeared first on Engineering.com.

It is entirely possible that digital transformation, at least in discrete manufacturing, has reached an inflection point, as many efforts that were initially well-funded and well-managed are losing focus and momentum.

It might be time to reframe digital transformation discussions and implementations, with a dramatic shift in the focus of digital transformation away from inputs (“new and improved” and “faster, better, cheaper”) toward measurable outcomes that fit smoothly into an enterprise’s business plans. This is nothing less than a reorientation of digital transformation and its enablement throughout an organization’s product lifecycle—from concept through end-of-life.

Nearly all industrial companies take digital transformation’s opportunities and implementation challenges very seriously. Management and leadership in a few laggard enterprises, however, still see digital transformation—or digitalization—as just a buzzword, even as they fall further behind their competitors.

A digital transformation tutorial

While it is clear to all within the PLM community that PLM is foundational for a meaningful digitalization strategy, senior leadership does not always understand this truth, leading to a paradox. The investment level in digitalization indicated by my organization’s research seems appropriate, yet success is in jeopardy.

Weak digitalization plans often reflect a need for top executives to be more aware of how much digitalization will potentially impact the jobs and responsibilities of everyone in the organization. Hence, well-thought-out integration and implementation strategies are mandatory. Also needed is the realization that digitalization is dependent on a strong and comprehensive PLM strategy, just as PLM is greatly enabled by digitalization. This synergy enhances the value of the resulting digital transformation initiative.

Fundamentally, digitalization is the next logical step in the ongoing revolution of representing anything and everything in 1s and 0s. This is to say that digitalization is moving from a fuzzy concept to a data-driven derailment of the status quo, including:

  • Transforming products from physical goods into intangible services, the product-as-a-service (PaaS) business model that renders the “product” into data.
  • As information’s importance continues to grow, products and services are increasingly bought and installed for the data they generate or collect.
  • New sources of information are speeding up innovation and product development, adding urgency to digitalization.

It is important to remember that today’s PLM professionals were busy with applications and implementations decades ago when digitalization really was a fuzzy concept—PLM was here first, so to speak. The digital tools that were used to support the product lifecycle, which have now been in use for a few decades, were the beginnings of digitalization. What those tools can now achieve is essential to meeting today’s broad-based enterprise-level digitalization objectives.

What does digitalization mean? Digitalization transforms business models to generate new revenue and value opportunities; it is far more than an analog-to-digital change. If leadership struggles with this, explain it in 1990s terms as “knowledge management.” And make sure digitalization is not seen as merely scanning and digitizing paper documents into images. Images are losing their value and rationale as containers for today’s massive information flows. With the shift from documents to data well underway, containers, in any format, are falling short of the enterprise’s real needs.

Moreover, digitalization is not a one-and-done transformation, which can be demonstrated with a look at the history of recorded music. Analog music formats evolved from 45 RPM records with a single song on each side to vinyl long-playing (LP) media with several songs on each side, followed by cassette tape players, the Sony Walkman, and CDs. Music digitalization arrived as MP3 players with vastly improved sound and hundreds of songs. Now digitalization has taken music online to reach everyone through streaming services.

Likewise, digitalization is the next step in lifecycle management in leveraging existing and future technologies—not the starting point. Enabling digitalization requires end-to-end connectivity, end-to-end lifecycle optimization and sometimes deep changes in the organization and its work culture.

Why points of view matter

The fundamental task in getting anyone to “see” anything is to understand that person’s point of view and how they developed that way of thinking. The challenge with PLM-enabled digitalization comes when senior leaders perceive digitalization as different from and more strategic than PLM, and then allow independent, disconnected implementations. These initiatives are plagued by:

  • Implementation gaps and overlaps
  • Information resource duplications
  • Extensions and integrations that leave some information requirements unsupported
  • Independent and incompatible solution architectures

The results should surprise no one: Time-to-value and ROI are substantially diminished.

Points of view matter. To understand why, it helps to see the digital transformation-centric point of view as outside-in and the PLM-centric point of view as inside-out.

The Outside-in Point of View: Digitalization strategically reconfigures business functions and entities. Image: CIMdata.

The outside-in view sees digitalization as strategically reconfiguring business functions and business entities. Relationships with and between external entities become opportunities for new value propositions, especially in predictive service delivery, sales disintermediation (reducing the number of intermediaries between producers and consumers), and supply chain optimization. With these implementations, the likeliest solution platform is enterprise resource planning (ERP), the go-to toolset for purchasing, supplier relationships, cost management and profit forecasts, among other resource-intensive activities.

The Inside-Out Point of View: PLM is the platform for integrating external entities into lifecycle processes. Image: CIMdata.

From the PLM-centric, inside-out point of view, PLM is the platform for integrating external entities into lifecycle processes. In PLM, external entities become collaborative add-ons to internal lifecycle process flows. In this view, digitalization exploits the myriad application architectures used in product development and throughout the product lifecycle to enable new, high-value business models.

The takeaways here are twofold:

The digitalization community must recognize the power and necessity of a fully functional PLM platform—specifically that end-to-end connectivity and optimizing business functions and entities throughout the lifecycle are foundational to realizing their business and digital objectives.

The PLM community must appreciate the digitalization community’s view that strategically reconfiguring business functions and entities (not just “integrating” them) can result in major new business value propositions.

Both communities must be able to see the significant opportunities that lie at the intersection of their two perspectives.

PLM’s place in the enterprise’s digital landscape

PLM-enabled digitalization requires a deeper understanding of PLM’s role in the enterprise’s digital landscape. If a PLM-enabling product innovation platform is a viable way to enable the digital transformation of the lifecycle, it is then fair to ask: What is PLM?

After 40 years of working in and around PLM-enabling technologies, solutions and toolsets, PLM should be viewed as an end-to-end strategic business approach—a highly developed set of business solutions that are both internally and externally consistent and used for:

  • The collaborative creation, use, management and dissemination of product-related intellectual assets. Assets here means all product/plant definition information, i.e., the virtual product.
  • All product/plant process definitions, including virtual planning, designing, producing, operating, supporting, decommissioning and recycling/disposal.

Understood this way, PLM is far more than an engineering-oriented technology for product development. It’s an innovation platform that supports the extended enterprise and all of its needs from concept through end-of-life.

And bear in mind that innovation takes place not only during the enterprise’s countless transformation processes but also in the very definitions of the organization’s intellectual assets.

Next, we should clearly understand what intellectual assets are: All the components of the enterprise’s product and process definitions. This means all mechanical, electronic, software, formulas, recipes, specifications and documentation components, plus all the business, manufacturing and support process definitions that fall within the scope of the end-to-end lifecycle.

What is meant by “product”? Another sweeping definition: discrete manufactured products, of course—aircraft, cars and trucks, computers and software and medical equipment, as well as mundane things like pills, furniture, hats and coats, shoes and socks, foods and soda pop (canned or otherwise) and on and on.

Fast forward to the 21st century, where companies and their leaders can no longer afford to focus solely on traditional discrete products. Today, the demand spans a wide array of projects and assets, including:

  • Construction projects: Buildings, hospitals, bridges and highways.
  • Processing plants: Oil refineries and offshore drilling platforms.
  • Infrastructure assets and facilities: Airports, railways, distribution systems and their associated equipment.
  • And much more: Spacecraft, weaponry, ships and beyond.

These “products,” defined in all their endless variety and adaptations, are the organization’s intellectual assets in the endless, i.e., circular, product lifecycles of development, production, marketing and sales, usage, support, upgrades and maintenance, plus research for everything that comes next. Within these lifecycles, PLM is tightly linked (and usually integrated) with ERP, SCM, CRM and other enterprise solutions, as well as a myriad of product-definition and creation tools (e.g., computer-aided design, engineering, and so on).

Considering the above, it’s no surprise that information technology (IT) has been divided into two readily distinguishable parts. One part supports deliverable assets in the form of physical products managed with ERP and other similar solutions. The other part supports intellectual assets in the form of virtual products managed with PLM.

In turn, this IT division is leading to the formation of three distinct domains within digital transformation—PLM and ERP, of course, but increasingly execution. Execution is an all-embracing term for getting more competitive and innovative products, systems and assets (physical or virtual) into customers’ hands sooner while lowering overall costs.

And I see refocusing digital transformation efforts toward execution as a way for enterprises to stop punishing their capital structures.

Thus, digital transformation—digitalization—can yield a truly integrated collaborative environment at the heart of every organization that works to solve all common problems. This, too, is an inflection point.

The post Where and how PLM fits into a digital transformation initiative appeared first on Engineering.com.

]]>
Why Every Enterprise Needs Its Own Digital Twins https://www.engineering.com/why-every-enterprise-needs-its-own-digital-twins/ Fri, 19 Jul 2024 17:20:03 +0000 https://www.engineering.com/?p=52442 There is a new concept emerging in PLM: expanding digital twins from representing products, systems and assets to representing entire enterprises.

The post Why Every Enterprise Needs Its Own Digital Twins appeared first on Engineering.com.

]]>

In this article, I am returning to fundamental questions that often baffle technology users—many times to the surprise of those who earn a living from technology. In an earlier article, I addressed some basic queries: What does PLM actually mean, how do I know if I need it and what questions should be asked to figure that out?

I want to start by taking a step back. These questions may seem inane and answering them may seem unhelpful. Still, they address the challenges of comprehensive PLM implementations—the complexity of end-to-end connectivity and its digital threads, webs and networks, whatever they are labeled.

It’s a distressing fact that very few PLM implementations ever reach their ultimate projected goal. The reason is no mystery: implementers often do not push the project through to completion. They don’t “stay the course.”

A big part of the remedy is preventing key people from being peeled away to other major projects.

Big-picture context

The effort to ensure funding and resources for the three or more years needed to implement any major enterprise-class transformation must never be overlooked. The cost of a major transformation could add up to several million dollars—sometimes much more. The usual celebrations of incremental small wins persuade no one unless those wins are placed in a big-picture context.

As I have witnessed again and again, failure to keep everyone focused is the top reason why PLM implementations don’t scale up beyond the project and business-unit levels.

Maintaining staff and project management focus is also essential in bringing information technology (IT), operational technology (OT) and engineering technology (ET) together with PLM. While bringing IT, OT and ET together can also be done with enterprise resource planning (ERP) solutions, it’s not recommended, even if the enterprise is supply-chain intensive.

True to its predecessor, material requirements planning (MRP), ERP focuses on enterprise resources—primarily money, people, orders, supplies and facilities. But as PLM users know, the enterprise (or a product or an asset) is much more than the sum of its resources and inputs. Digital twins of the enterprise must represent its structure—how its dozens (or hundreds) of moving parts are sustained, enhanced and kept in sync. Thus, a PLM representation of the enterprise focuses on the organizations it encompasses and the activities that support each line of business, along with their relationships, internal stresses and all the frictions among them.

Thus, the digital twins of the enterprise are about much more than inputs and resources. For starters, the digital threads of enterprise-scale digital twins must connect to anything and everything that enhances competitiveness and profitability—or that threatens how they work together. Ensuring long-term sustainability in this way requires that any focus on factors external to the digital twins must be matched by a focus on the internals.

As for IT, OT and ET, PLM has increasingly accessible capabilities. These are now truly powerful business platforms that clarify and simplify complex challenges and, most importantly, ensure the timely delivery of world-class products to the marketplace.

This I see as Right-to-Market—the right product, to the right market, at the right time, with the right capabilities, at the right price. Enabled by PLM, Right-to-Market builds enterprise sustainability by ensuring that:

• Even the most competitive rivals can be outperformed.

• Marketplace presence is strengthened with collaboration and innovation that span the enterprise.

• Keeping customers happy is a central focus.

• Long-term profitability is built in for the enterprise—the justification for implementing every large-scale technology.

Right-to-Market is in sync with enterprise-class PLM. This means expanding the implementation of PLM until its digital twins represent the entire enterprise, not just its products and assets. PLM is already in common and profitable use to manage all the individual assets and systems that, along with people, make up the enterprise.

Right-to-Market means completing digital transformation and much more. This ensures that collaboration and innovation work closely with and build on each other and that they mesh smoothly with engineering, production, marketing and service, as well as connecting the enterprise’s IT, OT and ET process and technology environments.

Enterprise-class PLM is the only viable way to achieve this top-level integration, which is bringing IT, OT and ET together. No other solution or technology has the capabilities and resources necessary to support enterprise re-creation on this scale. And only in this way can marketplace rivals be outperformed.

Enterprise-class PLM and the transformations inherent in Right-to-Market lead to smarter evaluations, quicker interpretations of data and information and fewer errors. Enterprise-class PLM helps analysts and decision-makers see the big picture the marketplace presents, to reconcile conflicting viewpoints and to overcome resistance to change—all while silos of information are opened, connected and integrated.  

This brings me to the next question: How do we determine whether operations are big enough, complex enough, or if the company’s products and/or services are sufficiently complex to benefit from PLM?  

Essentially, this means taking an inventory of product, asset and system complexities and their viability. Here are a few useful indicators:

• The enterprise has sufficiently sophisticated products and assets to meet foreseeable customer needs.

• The connectedness and effectiveness of the information processing embedded in products with built-in electronics and supporting software.

• The growing need for products, assets and systems to accommodate change that is abrupt and all-pervasive.

This last point includes new physical, mechanical and materials capabilities; new production and service processes; built-in electronics and embedded information processing; artificial and augmented intelligence; evolving customer expectations and demands; a changing cast of aggressive competitors; and new opportunities in core markets and adjacent segments.

Following this effort, a concerted push may be needed to measure the sophistication of the enterprise’s assets and systems in terms of their competitiveness, profitability and sustainability. Bearing in mind that sophistication and complexity are inseparable, the focus here is:

• Determining and enhancing long-term sustainability and profitability of assets.

• Verifying the soundness of product and asset service lives in terms of marketplace shifts, changes in demand and competitive initiatives.         

• Quickly identifying and exploiting new marketplace opportunities.

• Identifying assets suitable for Product-as-a-Service (PaaS) business models.

• Finding profitable uses for under-utilized capabilities.

• Identifying and disposing of obsolete assets, ending the use of outdated processes and avoiding creeping obsolescence with its potential to ambush business plans.

If these analyses and inventories are done conscientiously, almost every organization will quickly see that it will benefit from implementing PLM or broadening its use.

When digital twins grow to support the enterprise, digital threads link them to everything in its marketplace, imposing huge demands on the breadth and depth of connectivity. On the other hand, less granularity is probably needed than what is customary for a product or asset, keeping the appropriate enterprise digital twins within manageable and usable proportions.

Moreover, digital twins, digital threads and their connectivity change endlessly, which points to a reality of PLM in its ultimate configuration—its need for ongoing support similar to the support invested in IT, ET, or OT.

Strategic business approach

Think about it: while the lifecycle of the enterprise is infinite, everyday product and asset models can quickly mushroom to intimidating proportions, requiring many digital threads and a great variety of connectivity. Hence, the meanings of “end-to-end” and “lifecycle” can change daily.

As every project manager knows, justifications can morph into expectations and then into benchmarks for progress and gauges of success, often with little leeway. Support inevitably wanes without meeting these gauges and measurements or at least acknowledging the constant updates and modifications. As anyone working in technology realizes, the passage of time requires more funding support and more staff resources, not less. This may seem obvious, but I find that the implications are often overlooked.

Change has become abrupt and all-pervasive—new physical, mechanical and materials capabilities; built-in electronics and embedded information processing; artificial and augmented intelligence; evolving customer expectations and demands; a changing cast of aggressive competitors; and new opportunities in core markets and adjacent segments.

As a long-time definer and observer of PLM, I do not doubt the viability of digital twins that are sufficiently robust to accommodate the entire enterprise. This is what I mean by Right-to-Market—ensuring the enterprise is optimally configured to enable and sustain long-term success in its marketplace(s). This ultimately means that the enterprise consistently maximizes its return on investment.

And so, the inevitable technology user’s question: can PLM solution providers’ tools support Right-to-Market? Yes, as evidenced by their nonstop development of new capabilities, simultaneous uptake of new technologies and ongoing accommodations to structural changes in marketplaces.

This is aided by using the Cloud in its broadest sense with the introduction of product-focused and technology-focused platforms easily connected to feed ever-larger and evolving digital twins.

Complexity often hamstrings implementations but getting a handle on it and “taming it” is why we take the risks that always accompany new technologies. Failure to implement exposes us to risks that, over time, can only worsen. And in today’s global marketplaces, basing decisions on “what has always happened” is riskier than ever.

This is why every organization needs to embrace PLM as the strategic business approach it is.


]]>
From aerospace to appliances, how PLM is tackling costly data issues https://www.engineering.com/from-aerospace-to-appliances-how-plm-is-tackling-costly-data-issues/ Thu, 06 Jun 2024 10:53:00 +0000 https://www.engineering.com/from-aerospace-to-appliances-how-plm-is-tackling-costly-data-issues/ Presenters at the annual CIMdata event zeroed in on how the biggest manufacturers use PLM to manage and streamline onerous amounts of data

The post From aerospace to appliances, how PLM is tackling costly data issues appeared first on Engineering.com.

]]>
PLM Road Map & PDT, a collaboration with Sweden’s Eurostep Group, is CIMdata’s annual North American gathering of product lifecycle management (PLM) professionals. This year’s event was a deep dive into product data and information and how they can be better managed. Presenters zeroed in on one of PLM’s biggest existing challenges—how it plays a critical part in an organization’s digital transformation.

Digital transformation is increasingly recognized as critical to overcoming the data complexities that frustrate every organization. In my opening remarks I stressed that successful digital transformation requires keeping an eye on the evolving trends and enablers—CIMdata’s Critical Dozen.

I also pointed out that:

  • Maximum PLM value will only result from a holistic, end-to-end approach to connectivity, strategies, and tactics provided they appropriately address people, process, and technologies. Nearly all PLM failures result from not following through and not staying the course.
  • Organizational change management will be required; it always plays a critical role in maximizing the adoption of new processes and technologies, and delivering bottom-line value.
  • Evolving customer demands and market opportunities are motivating investment in comprehensive product lifecycle digitalization.

I outlined the major focal points, including enterprise application architectures, configuration management, extending bills of materials (BOMs) through bills of information (BOI) structures, model-based structures, the Internet of Things (IoT), Big Data and analytics, augmented intelligence, digital skills transformation, organizational change management and, of course, digital twins and digital threads.
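To make one of these focal points concrete—extending a bill of materials into a broader bill of information—here is a minimal Python sketch. The `Item` class and its field names are hypothetical illustrations, not taken from CIMdata or any PLM toolset; the point is only that a BOI structure carries non-physical lifecycle assets (requirements, simulations, documents) alongside physical parts.

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """A node in a product structure: a part, assembly, or information asset."""
    item_id: str
    kind: str                     # e.g., "part", "assembly", "requirement", "simulation"
    children: list = field(default_factory=list)

    def add(self, child):
        self.children.append(child)
        return child

    def count(self, kind=None):
        """Count nodes in this structure, optionally filtered by kind."""
        n = 1 if (kind is None or self.kind == kind) else 0
        return n + sum(c.count(kind) for c in self.children)

# A classic BOM holds only physical parts...
engine = Item("ENG-100", "assembly")
engine.add(Item("P-1", "part"))
engine.add(Item("P-2", "part"))

# ...while a bill of information also links requirements, models and test data
# to the same structure, so lifecycle context travels with the product.
engine.add(Item("REQ-7", "requirement"))
engine.add(Item("SIM-3", "simulation"))

print(engine.count())            # 5 nodes in total
print(engine.count("part"))      # 2 physical parts
```

In a real PLM system these nodes would be managed, versioned objects with access control; the sketch only shows why traversing a BOI differs from traversing a parts-only BOM.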

Each of this year’s presenters, whether from private or public sector organizations, is hands-on in some aspect of the digital thread. Several presenters were from member companies of CIMdata’s Aerospace and Defense PLM Action Group (AD-PAG), which is celebrating its tenth year under the leadership of James Roche, CIMdata’s A&D practice director.

Keynoting was David Genter, director of systems design engineering for Cummins Inc.’s Accelera unit, which was formed to leverage design for sustainability (DfS) by “developing a design optimization culture.” Cummins, based in Columbus, Ind., produces more than 1.5 million engine systems a year and is driving a decarbonization transition as it refocuses from diesel engines to alternative-fuel internal combustion engines; fuel-cell, battery-electric, and hybrid powertrains; and electrolyzers that generate hydrogen and oxygen from water.

Genter stressed “moving analysis to the left,” which means using analysis early and often to engineer sustainability into designs from the start. He cited DfS test cases saving more than $1.4M in the first year by removing non-value-added material in the designs of engine mounting support brackets and exhaust manifolds. When a commitment is made to early analysis for material-use optimization, he noted a typical 10% to 15% material savings, a five-fold return on DfS engineers’ salaries and big improvements from right-first-time design results. 

“Addressing climate change is daunting but DfS is not!” Genter observed. Cummins is committed by 2030 to reducing greenhouse gas emissions by 50%, water use by 30%, waste by 25%, organic compound emissions by 50% and new-product Scope 3 emissions (indirect but related) by 25%, and to generating circular lifecycle plans for every new part.

Optimizing designs with right-first-time techniques can seem “tough” when “everybody is already overwhelmed with the other work,” Genter said, but Cummins has verified that DfS need not add any organizational burdens or stretch normal design times for new products—quite the opposite.

Several presenters addressed the elimination of paper documents. Robert Rencher, a senior systems engineer and associate technical fellow at Boeing, shared some numbers and described a highly successful solution. In his work with the AD-PAG, he has probed airlines’ continuous checks of aircraft and the U.S. Federal Aviation Administration (FAA) Form 8130-3 Authorized Release Certificate (ARC).

Form 8130-3, the FAA’s airworthiness approval tag, is the airline industry’s good-to-go certificate. It is ubiquitous and mandatory for every aircraft inspection and every new or repaired part.

Rencher’s numbers were eye-openers. He reported that a single FAA 8130-3 may require as many as 600 engineering and inspection documents, and documentation for a single bolt can add dozens of pages. One large U.S. airline conducts over 200,000 aircraft checks every year, he said, and:

  • 90% of this documentation is still handled on paper.
  • The labor cost to file one document was $20 in 2020.
  • Between 2% and 5% of these documents “are lost or misfiled.”

In his presentation, delivered on behalf of the AD-PAG and drawing on the experiences of all its members, including Boeing and Airbus, Rencher said Airbus can now electronically generate and verify all the information needed for an 8130-3; this is establishing a digital data exchange for aerospace quality certification. Digitizing this documentation could save the aerospace industry €80 million annually in Europe alone, with over €20 million in annual savings already reported.

Rencher summarized that combining the digital thread with distributed ledger technology has proved to be a success: users of 8130-3 gain the digitization, traceability, provenance and accessibility of part data across the part’s product lifecycle from design to final disposition.
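As a rough illustration of the general technique Rencher describes—ledger-backed provenance for part records carried on a digital thread—here is a toy hash chain in Python. This is a generic sketch of tamper-evident record linking, not Boeing’s or Airbus’s actual implementation, and the record fields are invented:

```python
import hashlib, json

def chain_records(records):
    """Link each record to its predecessor by hash, giving tamper-evident provenance."""
    ledger, prev_hash = [], "0" * 64
    for rec in records:
        entry = {"record": rec, "prev": prev_hash}
        prev_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        entry["hash"] = prev_hash
        ledger.append(entry)
    return ledger

def verify(ledger):
    """Recompute each hash; any altered record breaks every later link."""
    prev_hash = "0" * 64
    for entry in ledger:
        expected = hashlib.sha256(
            json.dumps({"record": entry["record"], "prev": prev_hash},
                       sort_keys=True).encode()
        ).hexdigest()
        if expected != entry["hash"]:
            return False
        prev_hash = expected
    return True

# Lifecycle events for one (hypothetical) part, manufacture through certification.
events = [
    {"part": "BOLT-42", "event": "manufactured"},
    {"part": "BOLT-42", "event": "inspected"},
    {"part": "BOLT-42", "event": "8130-3 issued"},
]
ledger = chain_records(events)
print(verify(ledger))                      # True
ledger[0]["record"]["event"] = "forged"    # tampering is detected downstream
print(verify(ledger))                      # False
```

A production distributed ledger adds replication, consensus and signatures, but the traceability-and-provenance property Rencher cites rests on this same chaining idea.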

Rencher also covered AD-PAG’s benchmark study on furthering the use of PLM’s digital twins and threads. In this effort’s fifth phase, AD-PAG is working with numerous industry standards bodies, Rencher reported. The intent is to gain consensus and acceptance of results, increase the brainpower in the project, increase AD-PAG’s leverage with PLM software providers, and test PLM digital twin/thread definitions against more than 20 distinct use cases.

AD-PAG’s other focus is on the Systems Modeling Language (SysML) as an enabler of model-based systems engineering (MBSE). In an update, Chris Watkins, principal engineer, MBE/MBSE at Gulfstream Aerospace Corporation, who is the AD-PAG MBSE project leader, reported on open, standardized application program interfaces (APIs), common ontologies, and new SysML tool providers. He highlighted persistent shortcomings in syntax, notations and interoperability, along with ambiguities and “poor support” for the Universally Unique IDs that promise to address many of these challenges. The AD-PAG is addressing these in a follow-on project phase.

McDermott International Ltd. is actively driving the use of PLM in the engineering, procurement and construction (EPC) industry, which regularly joins discrete and process capabilities in its multibillion-dollar energy infrastructure projects; these are a tough challenge for any digital technology. Houston-based McDermott designs and builds on land, at sea, and underwater worldwide.

Jeff Stroh, McDermott’s senior director of digital and information management systems, said every McDermott project has millions of documents and digitalization is increasingly urgent. Many documents are from Microsoft Office, but many more are from other digital tools that are not integrated or only partially so—and they reference many sources.

Challenges abound, Stroh stressed. The re-use of prior document content is limited in EPC, and quality controls are “purely manual,” so change identification and management are difficult. EPC projects rely heavily on offline commenting and mark-ups that must be manually processed, he pointed out. Documents and sources are disconnected, so EPCs have no effective way to “bake in” lessons learned.

As yet there are no effective processes for knowledge management amid nonstop additions, deletions, clarifications and replacements, Stroh added. Nor are there any ways to link content from engineers’ applications to a project’s narrative documents.

Stroh, too, had some eye-opening numbers. In EPC, clients supply thousands of documents but McDermott’s engineering deliverables mushroom to many multiples of that; moreover, each individual document goes through multiple revisions and approval cycles. In one project, the client made nearly 60,000 comments on more than 3,000 documents, he noted, all of which had to be responded to.

Project manhours have ballooned in recent decades, he continued, and the EPC business climate has become very risky: “cost-plus” contracts are gone and many contracts are now done at a fixed price.

To cope, McDermott engineers need data that is available and usable in their tools and applications; data that reduces the friction in work and finding answers; that drives collaboration and visibility across tools, functions, and disciplines; and that augments the workflow, “not interrupt it.” Legacy approaches, “voluminous narrative documentation and manual processes are not fit for the fast-paced world…”

Finally, Stroh commented that this was McDermott’s second PLM try. The first failed because it focused on “document management instead of reimagining processes based on data and information management.” Success in this new implementation is also tied to adopting the users’ terminology into the processes and tools rather than expecting users to adopt the tools’ terminology.

Mabe, a Mexico City-based white-goods manufacturer, presented its efforts to get new products to market faster, extend and improve product families, reduce business complexity and improve productivity across the organization. With eight factories, Mabe annually produces nearly 10 million refrigerators, ranges and other appliances. General Electric Appliances is Mabe’s largest reseller.

Speaking were Gabriel Vargas, director of engineering systems, and Maria Elena Mata Lopez, leader, technical information and modular architecture.

Through digitalization, automation and a modular product family architecture that enables automated product configuration, Mabe’s benefits were dramatic, according to Vargas and Mata Lopez. The resources needed for new product introductions fell by 90% and those for product maintenance by 80%, Mata Lopez reported; the effort to manage BOMs was reduced by 80%; parts counts fell by 60% for ranges and 42% for refrigerators. Cost management was sped up and made more accurate, she added.

Digital transformation leaders from General Electric Aerospace, the Gulfstream Aerospace Corp., and Moog also spoke at the conference.

A persuasive case for using artificial intelligence (AI) in product design was made by Uyiosa Abusomwan, senior global technology manager for digital engineering at the Eaton Corporation. He asked whether AI could be used to optimize product design “the way nature optimizes each new creature.”

Addressing the application of AI to supply chains, Abusomwan pointed out that AI can identify features, extract parameters and put them in 3D CAD formats. As an example, he noted, data can be fed into quality management solutions and into PLM so failures and causes are detected and not repeated; Eaton is already doing this.

For AI to be used to optimize designs, he continued, open minds are needed along with new data management tools, which can cut across design solutions including PLM, and an understanding of the values being sought. Abusomwan predicted that third-party solution providers in PLM and IT will be buying up start-up companies that offer these capabilities.

The conference’s second day focused on the public sector with presentations from or about the U.S. Defense Department’s Research and Engineering, the Defense Acquisition University, the Naval Surface Warfare Center, NASA, and the U.K. Ministry of Defence. All addressed digital transformation issues and their agency’s progress.

A keynote presentation was given by Daniel Hettema, director of digital engineering, modeling, and simulation in the Office of the Undersecretary of Defense for Research and Engineering. He said his office is “getting into systems engineering big time” and has launched a review of the DoD’s modeling and simulation policies.

The DoD, he noted, has 700,000 employees, over 1.4 million contractors, and 709 policies that address some aspect of digital model creation. Because of its size and complexity, “DoD doesn’t make big moves but…we can move the needle with small wins” such as leveraging digital technologies and using (newly) optimized customer models.

Government agency responsibilities are so much broader—and fundamentally different—than any private-sector organization’s responsibilities. So, the public-sector presenters focused on policies and guidance rather than solutions and processes.

Most presenters at this year’s PLM Road Map & PDT zeroed in on the many different challenges in digital transformation and individual companies’ priorities in tackling them through PLM and some form of digital engineering. They all reported solid successes and rapidly evolving plans to complete their transitions.


]]>
Augmented Intelligence and Product Lifecycle Management—the Next Frontier https://www.engineering.com/augmented-intelligence-and-product-lifecycle-management-the-next-frontier/ Wed, 01 May 2024 09:32:00 +0000 https://www.engineering.com/augmented-intelligence-and-product-lifecycle-management-the-next-frontier/ A look at the many forms of AI, their differentiators and how to integrate them.

The post Augmented Intelligence and Product Lifecycle Management—the Next Frontier appeared first on Engineering.com.

]]>

A never-ending challenge for every organization, regardless of why it exists, is keeping track of what’s going on in and around it. This includes identifying new and emerging players and business models, of course, but also three sets of drivers of fundamental change: new technologies to be mastered and incorporated into new products; the irresistible trends in the marketplace where one competes; and what I call the trends, processes and functional capabilities (i.e., elements) necessary for successful digital transformation—CIMdata’s Critical Dozen.

As I described in previous articles, CIMdata’s Critical Dozen is an evolving set of elements that an organization must master in its digital transformation journey. This means that yesterday’s dozen isn’t necessarily tomorrow’s. This said, two elements (digital twins and digital threads) of the current Critical Dozen have hit their inflection point, forcing their convergence, and a new element—Augmented Intelligence—has emerged.

Digital Twin-Digital Thread Convergence

As I have often stressed, a digital twin without a digital thread is an orphan. A digital twin, as I described in previous articles, is a virtual representation of a product, a system, a piece of software, a network, or an asset. One or more digital threads keep that representation up to date from the initial concept through its complete lifecycle.

Digital twins and digital threads are co-dependent, always have been, and always will be. Without a digital thread, a digital twin is little more than database clutter with no guarantee of being up to date. And unconnected to a digital twin, a digital thread is a string of data running from nowhere to nowhere. What has changed is the perception of digital thread and digital twin codependence: it is now clearly and unambiguously understood. 

Artificial Intelligence (AI) powerfully enhances this convergence by extending the breadth, depth, and reach of digital threads while uncovering hidden drivers and causes of changes in digital twins. 

Product Lifecycle Management (PLM) is an innovation engine that helps orchestrate the creation, maintenance, and reuse of assets—digital twins and digital threads, in particular—that represent the enterprise’s products, systems, and networks. Thanks to easier and deeper access to the Internet of Things (IoT), digital threads foster steady enhancements in end-to-end connectivity throughout an asset’s lifecycle. 
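The twin/thread codependence can be pictured in a few lines of code: the twin is a snapshot of the asset’s state, and the thread is the ordered stream of lifecycle events that keeps that snapshot current. This is a conceptual sketch only—the class, event and field names below are invented for illustration, not any vendor’s API:

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Virtual representation of an asset; current only while its thread feeds it."""
    asset_id: str
    state: dict = field(default_factory=dict)
    thread: list = field(default_factory=list)   # the digital thread: ordered events

    def ingest(self, event):
        """Each lifecycle event updates the twin's state and extends the thread."""
        self.thread.append(event)
        self.state.update(event["data"])

twin = DigitalTwin("PUMP-17")
twin.ingest({"phase": "design",    "data": {"rated_flow_lpm": 120}})
twin.ingest({"phase": "operation", "data": {"measured_flow_lpm": 112}})

# Without the thread, only the latest snapshot survives; with it, the full
# history from concept through operation can be replayed and audited.
print(twin.state["measured_flow_lpm"])   # 112
print(len(twin.thread))                  # 2
```

Drop the `thread` list and the twin becomes exactly the “database clutter” described above: a state of unknown vintage with no record of how it got there.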

The Emergence of AI and its place amongst CIMdata’s Critical Dozen

The IoT is often seen as a major cause of the unprecedented explosion of data, structured and unstructured—”Big Data”—generated by billions of connected devices ranging from automobiles, HVAC systems and medical equipment to digital phones and even smart doorbells.

As part of what we understand to be the Fourth Industrial Revolution, these oceans of data have grown beyond human comprehension and even everyday computational capability. In PLM, this data fills our digital twins, races up and down our digital threads (and everywhere else within PLM), and often surges nonstop between PLM and its adjacent enterprise solutions. 

PLM-related AI-enabled applications are now leveraging this, and not a moment too soon. CIMdata wholeheartedly supports the exploration and adoption of AI as the next step in PLM’s central role in digitally transforming the enterprise—and in enabling and realizing how AI is being used to augment human intelligence. This Augmented Intelligence enables human decision-makers and domain experts to use the enormous inflows of data that threaten to overwhelm our digital systems. 

As its name implies, Augmented Intelligence, sometimes labeled Intelligence Augmentation, amplifies human and machine intelligence by merging them (i.e., Augmented Intelligence is the implementation of AI-enabled capabilities that add to and aid human insight and decision-making). In contrast, Artificial Intelligence attempts to supersede human intelligence. This distinction aside, product innovation platforms (i.e., PLM platforms) managing specific processes and data will be key recipients and implementers across organizations. 

The challenges common to any new technology are at work amid AI’s potential productivity gains. The regulatory and legal frameworks around AI technologies, including those that support Augmented Intelligence, already lag their rapid development. Ethical and moral considerations are emerging, and implementers must develop long-term strategies for using and sharing data. And with everyone’s expectations changing, workforce skills must be upgraded and customers must be educated.

The Many Forms of AI and How They Differ

Decisive action and implementation require understanding the elements of AI, which enables algorithms to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.

As with any emerging tech trend and process enabler, many different forms quickly appear in the marketplace as soon as the software and hardware are up to the task.

PLM solution providers are busily integrating AI tools and methods, including:

Generative AI (GenAI) is an iterative form of AI that can solve design problems in three steps, making it particularly useful for updating digital models, especially those related to PLM’s digital twins. (1) The design challenge is first formulated to generate all possible solutions. (2) Each design is then evaluated by simulating its performance, dimensional fits, surfaces, and so on. (3) The final step is algorithmic sorting to find the unique, highest-performing design; these designs are often shown as oddly lumpy structures. Integrated with PLM, GenAI promises faster digital threads and better updates to digital twins, among many other potential applications.
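
The three-step generate–evaluate–sort loop described above can be sketched in a few lines of Python. The design parameters, the scoring function, and all numbers here are invented stand-ins for a real solver and simulation:

```python
import random

def generate_candidates(n, seed=0):
    """Step 1: generate candidate designs as (thickness, rib_count) pairs."""
    rng = random.Random(seed)
    return [(rng.uniform(1.0, 5.0), rng.randint(1, 8)) for _ in range(n)]

def evaluate(design):
    """Step 2: a stand-in 'simulation' scoring stiffness per unit weight."""
    thickness, ribs = design
    stiffness = thickness * (1 + 0.3 * ribs)
    weight = thickness + 0.5 * ribs
    return stiffness / weight

def best_design(candidates):
    """Step 3: sort algorithmically and keep the highest performer."""
    return max(candidates, key=evaluate)

designs = generate_candidates(100)
top = best_design(designs)
```

In a real generative-design tool, the evaluation step is a physics simulation and the loop iterates many times, but the shape of the process is the same.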

Large Language Models (LLMs) generate specific text and images on demand by combining words and untangling the ambiguities in everyday language with vectors (and other tools) that rely on neural network architectures called transformers. LLMs, such as the wildly popular ChatGPT from OpenAI, are “trained” on enormous amounts of data, including text and images. ChatGPT and many other “chatbots” are put to work—”prompted”—with instructions in everyday, natural language; usually, no coding skills are required. By the way, “GPT” stands for Generative Pre-trained Transformer.

Built on machine learning (see below), LLMs make data problems obvious, cutting through the AI hype and taking us back to AI’s roots. As with GenAI above, LLM integration with PLM promises enhanced and faster digital threads and better updates to digital twins, e.g., from the IoT.
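
As a toy illustration of how vectors help untangle ambiguity, the hand-made three-dimensional “embeddings” below (all values invented; real models learn thousands of dimensions) let a context word pull the ambiguous “bank” toward its financial sense:

```python
import math

# Invented 3-D "embeddings" (roughly: finance, river, nature axes).
vectors = {
    "bank_finance": [0.9, 0.1, 0.0],
    "bank_river":   [0.1, 0.9, 0.3],
    "money":        [0.95, 0.05, 0.0],
}

def cosine(a, b):
    """Cosine similarity: how closely two vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# The context word "money" is far more similar to the financial sense of "bank".
finance_score = cosine(vectors["bank_finance"], vectors["money"])
river_score = cosine(vectors["bank_river"], vectors["money"])
```

Transformers perform a vastly more sophisticated version of this comparison, but the core idea—meaning as direction in a vector space—is the same.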

AI in Analytics

For many, the historical beating heart of AI is Analytics, explained quite well in the accompanying Gartner chart, “Source Planning Guide for Data and Analytics.”  The graph on the left side of the chart shows the four stages of Analytics—Descriptive, Diagnostic, Predictive, and Prescriptive; the axis labels them from a PLM user’s point of view.

AI-driven analytics can determine the value, usefulness, and relevance of an organization’s data. This is the source of AI’s new data fields that reveal previously unexpected insights, unknown connections, and unseen trends.

Many AI-enabled analytics approaches may prove beneficial to PLM, but two already stand out:

Machine Learning (ML) makes predictions and recommendations from patterns in data—whether structured or not. Over time, data and information can be added, and ML’s predictive algorithms can be enhanced without explicitly programmed instructions.
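
A minimal sketch of the idea, assuming a hypothetical sensor history: an ordinary least-squares fit “learns” a wear pattern from data, and its prediction improves as more readings are appended—no explicit rules are programmed:

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept -- learning a pattern from data."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Hypothetical history: machine operating hours vs. measured wear (mm).
hours = [100, 200, 300, 400, 500]
wear = [0.11, 0.19, 0.32, 0.40, 0.51]

slope, intercept = fit_line(hours, wear)
predicted_wear_600h = slope * 600 + intercept  # extrapolate the learned pattern
```

Production ML models use far richer algorithms and features, but all of them share this shape: fit patterns to observed data, then predict.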

Deep Learning (DL) is a form of machine learning in which a cascade of processing layers and units extract increasingly complex features from previous outputs; the result is the determination of new data in a hierarchy of concepts using dozens of dimensions of analysis. DL is the basis of our ubiquitous “digital assistants” Siri, Alexa, and Cortana, as well as many complex tasks (such as image processing in autonomous vehicles) that are done rapidly by combining images, audio, and video.  
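
A bare-bones sketch of the layer cascade, with tiny hand-picked weights (real networks learn millions of them during training): each layer combines all outputs of the previous one, and a nonlinearity lets later layers extract increasingly complex features.

```python
def relu(v):
    """Nonlinearity: pass positive values, zero out the rest."""
    return [max(0.0, x) for x in v]

def dense(inputs, weights, biases):
    """One processing layer: each unit combines all previous outputs."""
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

# Invented weights for illustration only.
x = [0.5, -1.2, 3.0]                                                   # raw input
h = relu(dense(x, [[0.2, -0.1, 0.4], [0.7, 0.3, -0.2]], [0.1, 0.0]))   # layer 1
y = dense(h, [[1.0, -1.0]], [0.05])                                    # layer 2
```

Deep networks simply stack many more such layers, which is what lets them build a hierarchy of concepts from raw images, audio, or video.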

For PLM users, AI-enabled analytics benefits are found in critical but everyday tasks, including reducing time spent on routine and mundane tasks, avoiding mistakes, achieving more consistent results, and improving response times in:

  • Material selection, manufacturing process definition, testing and quality control (QC), safety, and accident prevention.
  • System and software modeling with ChatGPT and other LLM tools.
  • Examining digital threads for novel insights and quickly identifying patterns in digital twins.
  • Supporting the management of changes to products and their designs in PLM.
  • Generation of training data for ChatGPT and other LLM tools.

AI in Enterprise Search

In preparing for my presentation for a recent virtual conference sponsored by Sinequa, a developer of an AI-enabled enterprise search platform, I recognized that many organizations seem to have forgotten that data is the core of digital transformation and that their digital initiatives are doomed unless data is better identified, understood, appreciated, and managed.

Organizations often overlook their PLM users’ constant, time-consuming searches for the information their jobs require; that data and information should be in the digital twins and threads and in the digital webs, processes, and systems where the organization’s data is generated and managed.

Data generation itself is a problem. The processes, systems, and smart connected products in today’s digital world generate huge amounts of data, threatening to lurch out of control. This makes finding the right data for actionable insights and sound decisions imperative. Big Data has already rendered most ordinary searching ineffective, so the latest capabilities must be provided, and not just to PLM users. 
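
One classic building block behind such search capabilities is the inverted index—a map from each term to the documents containing it—sketched here with invented document IDs and text:

```python
from collections import defaultdict

# Hypothetical engineering documents (IDs and text invented for illustration).
documents = {
    "ECO-1042": "bracket redesign fatigue crack aluminum revision C",
    "FMEA-007": "pump seal failure mode leak pressure test",
    "REQ-3310": "bracket load requirement vibration aluminum",
}

# Build the inverted index: term -> set of document IDs containing it.
index = defaultdict(set)
for doc_id, text in documents.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(query):
    """Return the documents containing every query term."""
    sets = [index.get(t, set()) for t in query.lower().split()]
    return set.intersection(*sets) if sets else set()

hits = search("aluminum bracket")
```

AI-enabled enterprise search layers relevance ranking, semantics, and access control on top, but fast term-to-document lookup remains the foundation.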

Leading solution providers like Sinequa also offer and leverage an ever-expanding array of AI-enabled tools and techniques to minimize GenAI’s mistakes and what users label its hallucinations. 

A particularly successful approach that Sinequa leverages is Retrieval Augmented Generation (RAG). By expanding LLMs’ information bases, improving context, and eliminating outdated information, RAG addresses the LLMs’ biggest difficulty—being crammed to overflowing with data. RAG’s back-end information retrieval also enables smaller, specialized LLMs, which can revolutionize enterprise search and overcome traditional information boundaries.
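
A minimal sketch of the RAG pattern, using a stand-in keyword-overlap retriever instead of a real vector search and an invented corpus: relevant, current snippets are retrieved first and prepended to the user’s question before any LLM sees it.

```python
def retrieve(query, corpus, k=2):
    """Rank corpus snippets by term overlap with the query (stand-in retriever)."""
    q = set(query.lower().split())
    scored = sorted(corpus,
                    key=lambda s: len(q & set(s.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query, corpus):
    """Augment the question with retrieved context before sending it to an LLM."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Invented corpus of enterprise snippets.
corpus = [
    "Part 774-A was superseded by 774-B in revision 12.",
    "The cafeteria menu changes weekly.",
    "Revision 12 added a corrosion-resistant coating to part 774-B.",
]
prompt = build_prompt("What changed in revision 12 of part 774?", corpus)
```

Production RAG systems use embedding-based vector search rather than keyword overlap, but the flow—retrieve, augment, then generate—is the same.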

With AI discussions prominent in boardrooms and on Wall Street, we can expect that forthcoming PLM solutions and every industrial product and service will likely have some form of embedded AI. Some recent announcements:

  • A ChatGPT-based “Copilot” key has been added to the keyboards of two new Microsoft Surface laptops, between the keyboard’s arrow keys and the Alt key. Copilots are an advance over digital assistants in that they are built on LLMs to automate tasks, not just assist with them.
  • Google’s LLM generative AI tool Gemini may be added to Apple iPhones later this year; Gemini is already in Android phones made by Google and others.
  • To help build its version of the industrial metaverse, Siemens Digital Industries Software and NVIDIA Corp., Santa Clara, Calif., are collaborating to add immersive visualization to the Siemens Xcelerator and its Teamcenter X PLM platforms; the goal is to use AI-driven digital twin technology to enhance the visualization of complex data, with “ultra-intuitive” photorealism. NVIDIA has similar links to Ansys, Lenovo, and Rockwell Automation.
  • To improve decisions based on PLM data and to boost supply chain productivity, data analysis, and sustainability workflows, more than 50 generative AI features have been added to the Oracle Fusion Cloud.
  • Predictive maintenance aims to reduce equipment failures by continuously monitoring condition and performance via text, video, imaging, etc., using ML, DL, and access to the IIoT.
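
The predictive-maintenance idea in the last bullet can be illustrated with a deliberately simple threshold check—hypothetical vibration readings and a trailing average as the baseline—standing in for the ML and DL models used in practice:

```python
def flag_anomalies(readings, window=3, threshold=1.25):
    """Flag readings that jump well above the trailing average -- a simple
    stand-in for the ML models used in predictive maintenance."""
    flags = []
    for i in range(window, len(readings)):
        baseline = sum(readings[i - window:i]) / window
        if readings[i] > threshold * baseline:
            flags.append(i)
    return flags

# Hypothetical bearing-vibration readings; the spike at index 6 warrants attention.
vibration = [0.9, 1.0, 1.1, 1.0, 1.05, 1.1, 1.9, 1.0]
alerts = flag_anomalies(vibration)
```

Real systems replace the fixed threshold with learned models fed by the IIoT, but the goal is identical: catch the deviation before the failure.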


The future of AI and PLM

As the preceding examples show, opportunities unlocked by AI-enabled solutions will be transformative and even revolutionary. Bringing PLM and AI together will empower human creativity with enhanced abilities to turn ideas into realities. The resulting new products, services, and business models will ensure the entire enterprise’s long-term sustainability. 

CIMdata foresees that AI in digital transformation will further strengthen the PLM innovation engine and extend its use, widening the use of AI and broadening demand for it. AI in PLM will prove invaluable in helping us to quickly discover “what we know,” i.e., what we have and can find in our data, along with revealing what we still need (or can’t find), comprehending the real value of our existing content, and finding new ways to generate what we need—thereby augmenting human intelligence in new and improved ways.

If businesses now run on data, what has changed?

Decisions are now based on poring through what appear to be endless streams of data—data that is assumed to be good. “Good” in this context means accurate and complete, which we all know data never is. Gone is the time-honored basing of decisions on direct observation of designs, production, sales, field service, and customer feedback. The few experienced and long-serving employees and managers still on the job are rapidly being displaced by data processed in many new (digital) ways and governed by poorly understood “standards.”

In other words, running businesses based on “experience” and direct observation is being augmented with new, powerful AI-enabled tools. Amid the Fourth Industrial Revolution, digital transformation—and so many retirements—business decisions are now based almost entirely on data. The overriding concern is whether AI can improve data quality before it is too late.

In an April 2024 Scientific American article titled A Truly Intelligent Machine, author George Musser explains why AI is so compelling: “We now realize that tasks we find easy, such as visual recognition, are computationally demanding, and the things we (humans) find hard, such as math and chess, are really the easy ones.”

The promises and benefits of bringing AI-enabled augmented intelligence into PLM are so huge that the consequences of not doing so are intimidating. But incorporating Large Language Models into PLM is still in its initial stages while, at the same time, many changes are coming in AI. The PLM transformation will likely be long and bumpy.

The post Augmented Intelligence and Product Lifecycle Management—the Next Frontier appeared first on Engineering.com.

]]>
Answering 3 top PLM questions https://www.engineering.com/answering-3-top-plm-questions/ Fri, 22 Mar 2024 10:35:00 +0000 https://www.engineering.com/answering-3-top-plm-questions/ Going back to basics to help users understand PLM.

The post Answering 3 top PLM questions appeared first on Engineering.com.

]]>
In this PLM article, Peter Bilello, president and CEO of CIMdata, draws inspiration from Answerthepublic.com, a marketing-focused platform that helps people find common user questions entered into search engines.


(Stock image.)


I will answer three basic questions from the platform answerthepublic.com:

  1. What does PLM actually mean?
  2. How do I know if I need PLM?  
  3. What should I ask to understand if my company needs PLM?

Two of the main reasons PLM efforts fail are a lack of awareness of it and a misunderstanding of what it is. My hope with this article is to address both head-on.

For context, let’s first explore CIMdata’s formal definition of PLM. It is “a strategic business approach that applies a consistent set of business solutions in support of the collaborative creation, management, dissemination and use of product definition information across the extended enterprise and spanning from product concept to end of life — integrating people, processes, business systems and information.”

Let’s then expand on this concept with insight from my recent webinar, “AI & PLM: Beyond All the Hype.” In the webinar, I stressed that PLM is a company’s innovation engine orchestrating the creation, maintenance and reuse of product-related digital assets.

For software developers and service providers, these definitions sum up PLM pretty well. But as a PLM consultant, researcher and educator, I realize that users need better insights — if only to help justify a purchase. By leveraging Answerthepublic.com we can see gaps in understanding and present our insights from a crucial, yet almost always overlooked, viewpoint: the user.

Q1.  What does PLM actually mean?

To help contextualize PLM, I find it useful to spell out five key lifecycle elements of PLM and three sources of the connectivity it must provide. The five elements are: 

  1. Customer insight: Figuring out what customers want and need by monitoring products in use, assessing customer acceptance or annoyance and measuring the utility of new features by remotely monitoring their use.
  2. Concepts and detailed design: Robust architectural concepts and detailed design based on user feedback joined with design considerations, manufacturing capabilities, maintenance and repair feedback.
  3. Testing: Monitoring the performance of prototype systems, bench-testing subsystems and optimizing performance and management via proving grounds and testing labs.
  4. Visibility in manufacturing: Readily available feedback on manufacturing defects, supplier quality control, supply chain logistics and OEM or supplier operations.
  5. After-sales service via feedback loops: Field defects, success and failure in diagnostics and repair, product health and scheduling of maintenance. This includes the use of new products and specific features.

Users will recognize these five elements as the major sources of information that must be fed into a PLM’s digital twins — the virtual representations of the enterprise’s products, systems, assets and associated process definitions. Meanwhile, the three sources of data, information, knowledge and intelligence that feed into these five PLM elements include: 

  1. Applications that help develop insights, root-cause analyses, planning decisions, their execution and control strategies.
  2. Information for use with data storage, search/find, analysis and development of new applications.
  3. Data from edge computations. This includes sensors, devices and on-board data-processing and IT systems.

Users will recognize this trio as the information fed into their digital twins, gathered by PLM’s digital threads. These threads represent the myriad of links up and down a product’s lifecycle and the data pipeline that enables connectivity and information gathering.
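
The twin-and-thread relationship described above can be sketched as a simple data structure: a twin holds the current state of one asset, and events arriving over different digital threads keep it current. All IDs and field names here are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Minimal virtual representation of one fielded asset."""
    asset_id: str
    state: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def apply(self, event):
        """A digital-thread event keeps the twin's state current."""
        self.history.append(event)
        self.state.update(event["data"])

twin = DigitalTwin("PUMP-0042")
# Events arriving over different threads: edge sensors, field service, etc.
twin.apply({"source": "edge_sensor", "data": {"vibration_mm_s": 2.4}})
twin.apply({"source": "field_service", "data": {"last_service": "2024-03-01"}})
```

A production PLM platform adds versioning, configuration management, and access control around this core, but the principle—threads feeding twins—is the same.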

Q2: How do I know if I need PLM?

The characteristic strengths of PLM — the collaboration and innovation enabled by its processes and data management platforms — are needed when:

  • Lifecycles of products, systems and services must be tracked for reasons including effective field maintenance, resolving customer complaints and pinpointing innovations for upcoming products.
  • The solutions to join, track and collaborate innovation effectively are unavailable. PLM’s “rivals” extend the capabilities of their offerings by adding toolsets to address this, but they are not always well-integrated and often have limited functionality. In PLM, these capabilities are native.
  • Products, systems and services grow in complexity with user demands and the accelerating rates of market changes. This makes it essential to track the lessons learned from users in the field, design, development and manufacturing.
  • Product becomes a data-driven service via the ongoing transformation of conventional, physical products, systems and services. Many factory equipment and other product providers now rely on a product-as-a-service (PaaS) business model: leasing their products to users while maintaining them and invoicing based on usage.

Q3. What should I ask to understand if my company needs PLM?

Some questions to keep in mind when considering PLM include:

  • Do your products and assets require multiple disciplines? No development application or toolset — electrical/electronic, mechanical or software development — can adequately track changes made by one discipline, unless all disciplines are appropriately integrated within a PLM solution provider’s product innovation platform. And if you have multiple platforms, it is time to rethink the need for PLM.
  • Are your product designs so complex that you need multiple computer-aided design (CAD) tools? Does using multiple CAD tools lead to siloed data, workflows and information? Those CAD tools may or may not communicate well, let alone enable the necessary collaboration, integration and tradeoff optimization needed to design today’s complex products. PLM, however, can address these needs.
  • Do you have varying relationships with suppliers? For instance, do some build to print, just sell to you, support production, service your finished products or share in your product development? The complexity here can disrupt the collaboration and integration needed to create the competitive products that sustain your organization. PLM can make sense of this complexity. 
  • Does your organization develop, manufacture and support products across multiple sites? Or, put another way, does your organization seek to design anywhere, build anywhere and support everywhere? This means product personnel are scattered geographically and probably have different backgrounds and skill sets. If so, these coworkers need to collaborate among themselves and integrate their work. PLM can help verify that this is done. 
  • Are your products configurable and are sales, build and usage options included in development? Here, complexities arise regarding what is and is not configurable and which options users are given. Configurability often grows in response to user demands, but regardless of its cause, single-purpose toolsets and overly focused/restrictive applications will always struggle with it, while PLM can shine.

Complexity is far from the only challenge experienced during a product’s lifecycle—from concept through life. CIMdata always asks its clients what they are doing, or plan to do, if:

  • Margins and selling prices are under competitive pressure.
  • Product launches are delayed.
  • Warranty claims and recalls are on the rise.

Each can have several causes. Sorting them out and addressing them requires effective collaboration and the integration tools found in PLM’s digital twins, digital threads and connectivity. These fundamental capabilities enable collaboration and integration throughout a product’s lifecycle. Their mutual dependencies address complexity challenges by continually building on each other. 

Running through this narrative are the themes of collaboration and innovation. These are processes PLM supports like no other technology can. Without PLM-enabled collaboration and innovation, the gathering and management of information, insights and inspiration will flounder. Only collaboration and innovation can ensure the long-term sustainability of an enterprise.

The post Answering 3 top PLM questions appeared first on Engineering.com.

]]>
What’s the difference between PLM, ERP, EAM and more? https://www.engineering.com/whats-the-difference-between-plm-erp-eam-and-more/ Fri, 16 Feb 2024 12:57:00 +0000 https://www.engineering.com/whats-the-difference-between-plm-erp-eam-and-more/ And why is PLM the best of the available options.

The post What’s the difference between PLM, ERP, EAM and more? appeared first on Engineering.com.

]]>
It is an irony of technology that the solutions, systems and platforms that seek to drive business benefits often describe themselves and their focus in strikingly similar terms. The descriptions and sales demonstrations presented to new prospects by many solution providers seem all-encompassing. In fact, without deep-dives into the nuances of their messages, solutions and associated capabilities, they become nearly indistinguishable—akin to clones.

A little PLM 101 is needed to compare it with its alternatives. (Image: Bigstock.)


The main source of this confusion is that the enterprise-level solution providers edge closer to each other’s capabilities with each release. In their pursuit to be competitive, their offerings have become so feature and function-rich that many look alike. Even so, they constantly add new capabilities in support of their core competencies and focused domains—which are dearly loved by the users they enable and empower.

This article looks at two enterprise-level solutions that are often confused—Enterprise Resource Planning (ERP) and Product Lifecycle Management (PLM). They stand out as the two most prominent digital solutions for managing data at the enterprise level. Proponents of both have been touting the benefits of their solutions for a few decades while rarely addressing each other, let alone lining them up side by side.

While briefly discussing four other enterprise-level solutions—customer relationship management (CRM), enterprise asset management (EAM), building information management (BIM) and manufacturing execution systems (MES)—I am mainly comparing ERP and PLM, or what I call the “big two.”

I will share what users of both ERP and PLM have told CIMdata. The hope is that their observations will speak for themselves. I will also explain why PLM is better from the standpoint of defining and managing the complete lifecycle, product data and associated lifecycle processes.

PLM, ERP and how they started to overlap each other

At first glance, ERP and PLM may not seem that different — but they are. Their differences emerge when we look at their roots and core functions. ERP mainly grew out of the need to manage finances and manufacturing (i.e., the management of physical assets), while PLM primarily grew out of the need to manage product development (i.e., the definition and management of product-related intellectual assets).

Each of these solutions emerged in the 1980s from the digitization and computerization required to manage an enterprise’s ever-evolving data and process needs. Both product and organizational complexities continue to drive the adoption of these and many more digital technologies.

Over the ensuing decades, these solutions steadily evolved, growing in features and functions as they were implemented at ever higher levels of the enterprise. Each solution matured to enable key elements of Information Technology (IT), Engineering Technology (ET) and Operations Technology (OT).

Growth-driven solution providers of ERP and PLM inevitably began to see their peers as rivals as the boundaries between the front end of the product lifecycle and its production side blurred and overlapped. Both solution camps also support similar data constructs and processes (e.g., parts, BOMs as well as engineering change and release management). Soon, new features and functions began to mimic each other, giving rise to today’s confusion that surrounds digital realms at the top of every enterprise.

Back to the roots of enterprise solutions

Different enterprise-level solutions begin, sensibly enough, with different digital roots. It is the development directions taken that led to their similarities. Here is how many different enterprise tools had their start.

  • ERP came to life as toolsets designed to track and manage finances, inventories and generate BOMs for purchasing and production. An offshoot of MRP, or materials requirements planning (or “processing”), ERP steadily expanded digital record keeping into “running the business.” ERP began as a way to keep key managers aware of everything impacting the bottom line; forecasting trends and automating these functions have progressed steadily ever since. The primary focus of ERP is on the management of an organization’s physical products (i.e., a physical asset-centric approach).
  • PLM is built on the basic connectivity of transforming data and ideas into information within product development (i.e., an intellectual asset-centric approach), which then (literally) drove the tools and systems of production and service. As this connectivity expanded, the need for structures and repositories became apparent, giving birth to PLM’s support of digital threads and digital twins. These elements, among others, distinguish PLM from its predecessor, product data management (PDM).
  • CRM originated as a toolset for tracking customer contacts through various channels, including websites, mailings and telephone calls. It evolved with the advent of social media to manage all external customer interactions to grow a base of repeat customers with better assessments of their needs and faster responses. Some CRM solutions even have PLM solutions built on top of their platforms.
  • EAM evolved from computerized maintenance management systems (CMMS) into a lifecycle management approach for monitoring and supporting the well-being and performance of maintained assets from acquisition/commissioning to the end of their productive life (i.e., in-service asset-centric). While more narrowly focused, CMMS solutions centralize, upgrade and automate maintenance management information. Today, many PLM and ERP solutions provide EAM support.
  • BIM emerged from the architecture, engineering and construction (AEC) branch of CAD, and focuses on building and maintaining infrastructures (e.g., buildings, rail and utility networks). In the private sector, BIM is universally used for large and small buildings. In the public sector, BIM addresses the design and management of data associated with complex projects and assets such as roads, bridges, utility plants and distribution networks, pipelines and dams for federal, state and local governments. Users also rely on BIM to manage complex relationships between owners/operators, contractors and project completion/hand-over. BIM can be thought of as a domain-specific PLM solution.
  • MES was developed to monitor and document real-time operations, turning raw materials into physical components and finished goods. MES tells decision-makers what is and is not happening in production, highlighting needed improvements and helping optimize production. MES is often used for intermediate data management between factory floor supervisory control and data acquisition (SCADA) systems and ERP.

As CRM, EAM, BIM and MES matured, solution providers added connectivity for frequently accessed third-party toolsets and platforms, starting with design and analysis basics, as well as simplified access to the Internet of Things (IoT). Along with the automation of everything and anything, toolsets (integrated or third-party) are now used for simulation and analysis along with computer-aided process planning and work instructions. Examples of these simulation, analysis and planning tools include metrology, testing, inspection; real-time video with augmented reality and virtual reality (AR/VR); data governance/configuration management, machine learning, predictive analytics, artificial intelligence (AI) and more.

To some degree, all are integrated with or are components of a Model-Based Enterprise (MBE) approach. Prompted by PLM’s success, many are developing rudimentary digital twins and digital threads, or are positioning themselves so that they enable and/or support these digital constructs.

What remains unchanged is the fundamental distinction between ERP and PLM users. ERP is the primary enterprise toolset for people with business or purchasing backgrounds focused on profitably “running” the business, emphasizing the management of the physical product. PLM is primarily used by engineers and others with technical backgrounds who develop and support the products, systems and assets that contribute to an enterprise’s competitiveness, emphasizing the virtual product.

Core functions and expansions of today’s enterprise solutions

With the incorporation of the additions and advances mentioned above, and thanks to its end-to-end bidirectional lifecycle connectivity, PLM has matured into an enterprise-spanning platform. The other five solutions, while providing value, lack PLM’s intellectual asset reach and robustness. Also, it is unlikely they will keep their (more limited) effectiveness as PLM solution providers invest in “immersive engineering” and shift their support of products and services into the support of an evolving digital “metaverse.”

PLM is rapidly being implemented across diverse industries—food and beverage, retailing, fashion, banking, insurance, packaging and distribution, media, transportation, pharmaceuticals, healthcare services and more. While many of these industries generate physical products, they are overshadowed by information gathering, simulation, analysis, compliance verification and feedback. Much of this cannot be handled by other enterprise platforms. Mostly, this is accomplished through modifications and add-ons rather than (usually troublesome) customizations.

ERP has also matured to span the enterprise and is now a platform to monitor, in near real-time, every action, change and decision that impacts the bottom line. It links finance, back-office functions, sales and purchasing and inventory management with sales, marketing, distribution, human resources, factory operations and suppliers. Even if they are not directly managing them, ERP systems facilitate E-commerce, security, privacy, risk management, remote work, energy consumption, environmental sustainability and various Industry 4.0 initiatives.

Interestingly, the terms “collaboration” and “communication,” ubiquitous in PLM, remain scarce in ERP self-descriptions. Presumably, because ERP remains true to its roots in tracking costs and quantities, leaving product development functions, related intellectual asset creation and management capabilities in the hands of others.

EAM has steadily evolved from preventive to predictive approaches for asset tracking and management. These can cover all phases of maintenance, including scheduling, planning, work sequencing/management, mobility, analytics, health and safety and even supply chains. While EAM is used in many manufacturing and service organizations, it is more often found in energy production and distribution, oil, gas, nuclear power, utilities, mining and chemical production. For asset-management businesses, EAM can serve as both PLM and ERP. As noted previously, PLM solutions are successfully eating away at this enterprise solution domain in many organizations.

CRM aggregates customer information to access current and prior contacts (sales and service), purchase history and updates and performance tracking. As with other enterprise solutions, CRM automates the sharing of customer interactions so business units and teams can work together smoothly. Leading CRM solution providers have begun integrating AI into their offerings. CRM has matured enough to be used as both PLM and ERP in companies intensely focused on sales and customer relations, such as dealerships.

BIM has also matured somewhat into both PLM and ERP for buildings, infrastructure and floorplans; monitoring on-the-job equipment; the transfer of work from job sites to factories and even the weather. BIM users can scan and find virtually anything using drones and innovative imaging technologies such as Lidar (light detection and ranging). As for connectivity and access, BIM users say it’s good now, but they never have enough of either to deal with local, county and state building codes and environmental regulations. In the public sector and civil engineering, BIM users monitor and manage projects as well as changes in specifications, budgets and completion dates.

MES has reached a level of maturity where it monitors and controls production inputs, personnel, machine uptime and downtime, support services and scheduling while tracking their effectiveness. The track-and-trace capabilities of MES create “as-built” records of products, and the production systems it manages now include additive manufacturing (3D printing), material handling, robotics, automated assembly and new processes for advanced materials. This is particularly valued in regulated industries that require documenting processes, tracking problems and implementing corrective actions. Many job shops and even some large machining companies use MES to “run the business,” not just manage production operations and outcomes.

How PLM and ERP remain different

The time has come to shift our focus from technology to people. ERP and PLM powerfully support and enable the information needs of their users, but the tasks and responsibilities, along with the required skillsets, differ significantly. The data and information underlying ERP and PLM constantly change, often abruptly and in unanticipated ways; both handle these changes well but accommodate them differently.

ERP’s dollars-and-cents focus demands skills in finance and accounting. Many of its users also have purchasing backgrounds. These users have ongoing needs for a narrow range of data and mostly repetitive information. These needs are logical given that ERP users focus on the here-and-now, tracking dollar-denominated costs, forecasting profits and projecting them into the near future.

PLM’s emphasis on modeling, analysis and communications demands in-depth skills for conceptualizing and developing viable new products, systems and assets—guiding them through production and into users’ hands, all while maintaining them. PLM users work with much more diverse information sources and types and have far greater needs for graphics than ERP users, whose tools and tasks are predominantly numbers-based.

Compared to ERP users, PLM users interact daily with a wide range of design tools, simulations, analysis systems, information types and formats. In contrast to ERP environments, PLM information and decisions are shared across the extended enterprise, augmented by continuous feedback. This has led to many PLM solutions being architected for integration and openness—realizing that they provide a data and process management platform that supports the easy integration of third-party and enterprise-specific applications configured and changed over time.

Tasked with keeping costs within budgets, ERP users primarily focus on the enterprise’s biggest cost drivers—purchasing, production and sales. Managing design and development, with its countless detailed component descriptions, work instructions and inspection criteria for every production phase, is much easier in PLM. So is accommodating the model-based enterprise (MBE), with its Model-Based Design and Model-Based Systems Engineering elements that align with the organization’s overall digital aspirations.

Users leverage PLM’s collaboration and information access across large spans of the enterprise while fostering the digital transformation of the organization’s intellectual assets and the processes that create, manage and use them. Compared with ERP users, PLM users need far more detailed representations of intermediate stages, analyses and decisions as products evolve from ideation through engineering, production, sales and service.

Can innovative, competitive new products be generated in ERP? I must admit, I have never heard of any, though undoubtedly some have been. ERP’s top-down approach is essential for management control over as much of the enterprise as possible. This inherently makes it backward-looking, focused on identifying and evaluating past results.

The PLM advantage is that its focus is on innovation, intellectual asset creation and management. This fundamentally forward-looking framework makes PLM non-hierarchical, collaborative, structured and effective.

Again, I have never heard of engineers innovating and collaborating using ERP. Why not? Because ERP tends to be a top-down approach that fosters management “control.” PLM’s strength is that, being inherently more collaborative and less hierarchical, it will always be preferred by engineering and product development. Fundamentally, ERP focuses on an organization’s physical assets, whereas PLM focuses on its intellectual assets. As a result, a company needs to digitally enable its business with the appropriate solutions for the task at hand and not try to utilize a solution that wasn’t designed for the purpose being addressed. Trying to do so should be deemed a non-starter.

Further reading

This article is the culmination of my latest series of PLM-focused articles developed for Engineering.com.

The post What’s the difference between PLM, ERP, EAM and more? appeared first on Engineering.com.

Supply chain resilience through PLM integration

Published on Engineering.com, Jan. 5, 2024 (https://www.engineering.com/supply-chain-resilience-through-plm-integration/): how to become supply chain resilient and what it will look like.
Given the steady increase in disruptions hitting our enterprises and their rising costs, it is time for us to recognize that our digital solutions often pile capabilities on top of one another without considering the ability of our organizations and enterprises to combat and recover from disruptions.

Disruptions are a daily part of life—often serving as the catalyst for innovation or rapid unplanned change. In business, disruptions range from malicious hacks and sudden shortages to logistics problems and climate change. Taken in aggregate, they are as constant and varied as the surprises in life itself.

(Image: Bigstock.)

As our understanding of disruptions grows, it becomes evident that resiliency—the ability to overcome and even benefit from disruptions—is a critical strength for enterprises. Until recently, this was not always appreciated. However, this view is now shared by Washington, where the inaugural meeting of the White House Council on Supply Chain Resilience took place in late November 2023.

This council, which grew out of the 2021 Supply Chain Disruptions Task Force, announced 30 new actions spanning 19 federal agencies, including Commerce, Transportation, Defense and Homeland Security.

The idea of using software-enabled business solutions to address anything so universally entrenched and unpredictable as disruption may, at first, seem ludicrous. But even some of the largest problems can be remedied if broken into manageable pieces. This perspective is equally applicable to disruptions in supply chains, prompting my focus on the increasingly urgent need to integrate supplier data into enterprise product lifecycle management (PLM) software and strategies.

What Is Resiliency and Who Can Tackle It?

Resiliency is the set of risk-tolerant capabilities needed to recover from disruptions, or even to seize the unexpected opportunities they present. However, this definition only raises more questions:

  • Can resiliency be measured?
  • How much does it cost?
  • What resources, and how much of them, should be allocated to build resiliency?
  • When do we have enough resiliency to foresee and recover from obvious disruptions?
  • Do we even know what ‘enough’ means?
  • What about the disruptions we don’t foresee?
  • How do we know if we have recovered from a disruption, and how would that be measured?

Remember that disruptions tend to spread rapidly in lean, tightly integrated companies, so answering these questions now, when things are stable, is imperative.

Building an organization’s resiliency is a daunting challenge and requires a proactive approach, so the sound and sensible first step is to set up a resiliency task force. I urge the task force to focus initially on the supply chain, where disruptions continue to worsen.

Supply chain disruptions range from the obvious, such as losing a key supplier to a natural disaster, to the obscure. As we have seen with microchips, lithium ores and so much else, disruptions underscore the imperative of embedding resiliency into the entire enterprise.

Some sources of disruption include:

  • Hacking in all its malicious forms.
  • Sudden unavailability of key parts, halting production or forcing redesigns.
  • Abrupt changes in customer demand and financial health.
  • Unexpected sharp price hikes and inflation.
  • Transportation problems with ships, trucks and freight trains.
  • Just-in-time (JIT) inventory management that prioritizes minimal inventories and costs.
  • Overcorrecting in attempts to prevent shortages.
  • Misplaced priorities, such as an overwhelming focus on cost savings.
  • Climate change and geography, which affect flooding, tornados, snowstorms, droughts and more.

Can an organization really foresee disruptions? Yes, for the most part, by asking classic journalistic questions, like:

  • Who or What will trigger our next disruption and the ones after that?
  • Where will these disruptions come from?
  • When will they strike?
  • Why, as in what underlying events will generate disruptions?
  • How can the costs of disruption be calculated? And can recovery be assessed in some better way than through the rear-view mirror?

The key to answering many of these questions is hidden within PLM solutions.

Supply Chain Resilience Is an Enterprise-Wide Project

Given the frequency and extent of disruptions, JIT supply chains and least-cost inventory management must be reconsidered in the broader context of optimizing the end-to-end product lifecycle. In other words, supply chains, the bills of materials (BOMs) they propagate and their users can benefit from PLM’s digital twins, digital threads, end-to-end bidirectional lifecycle connectivity and big data analytics. Purchasing and procurement departments—which build and manage supply chains—can also benefit if they are integrated into an enterprise’s overall PLM environment.

This PLM integration will necessitate that supply chains are less tightly controlled by finance and enterprise resource planning (ERP) systems. This is beneficial as some traditional systems are too narrowly focused on transactions instead of the what-if analyses required to swiftly identify, resolve and minimize disruption.

In many cases disruptions are vague. In these instances, PLM’s digital twins will often have nothing physical to represent. As a result, their numerous digital threads won’t have any data to connect. Nevertheless, I advocate switching from a reactive to a proactive stance because some disruptions can be recognized in advance and thus prevented or mitigated. Carefully focused PLM strategies and their associated digital solutions can enable decisive action when early indicators arise.

Part of this proactive approach includes the realization that disruptions can strike anywhere, at any time, requiring resiliency to permeate the entire enterprise. Building this resiliency requires close collaboration between PLM managers, security, IT, top management, finance, and purchasing and procurement.

PLM’s Role in Addressing Disruptions and Building Resiliency

For years, we have been hearing about worsening supply chain difficulties and sudden shortages; these disruptions seem to be the new normal. On the brighter side, PLM-integrated supply chains can do a better job of supporting product development and all the other enterprise functions that rely on external suppliers. This will hopefully reduce the risk of sudden halts in the production of new and old products, systems or assets.

Integration with PLM not only enhances supplier collaboration but also provides a single source of truth for identifying and managing key suppliers. This helps the enterprise better understand and track supplier capabilities, quality, performance and risks (e.g., reliance on single-sourced parts, components and systems).
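One risk the article calls out—reliance on single-sourced parts—is straightforward to surface once supplier data lives alongside the BOM. The sketch below is illustrative only: the part numbers, supplier names and flat (part, supplier) export format are hypothetical assumptions, not the schema of any particular PLM product.

```python
from collections import defaultdict

# Hypothetical BOM rows as (part_number, approved_supplier) pairs,
# such as might be exported from a PLM system's supplier records.
bom_sources = [
    ("PN-1001", "Acme Metals"),
    ("PN-1001", "Borealis Alloys"),
    ("PN-2002", "Chipworks"),        # single-sourced: a resiliency risk
    ("PN-3003", "Delta Plastics"),   # single-sourced: a resiliency risk
    ("PN-3003", "Delta Plastics"),   # duplicate rows must not hide the risk
]

def single_sourced_parts(rows):
    """Return parts with exactly one distinct approved supplier."""
    suppliers = defaultdict(set)
    for part, supplier in rows:
        suppliers[part].add(supplier)
    return sorted(part for part, s in suppliers.items() if len(s) == 1)

print(single_sourced_parts(bom_sources))  # ['PN-2002', 'PN-3003']
```

The point is not the code but the prerequisite: this check is only possible when supplier relationships are integrated into the PLM environment rather than scattered across purchasing spreadsheets.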

This integration emphasizes the need to bring supply-chain management into PLM environments. It is critical for ensuring resiliency as disruptions can strike any part of the organization, from human resources to the executive suite. And the more tightly integrated the organization, the greater the pain of disruptions. This is why purchasing, for example, must no longer be a support function, stand-alone department or other organizational stepchild. Everything must be connected.

Unmatched by any other strategic and holistic business approach, PLM and its enabling digital solutions provide users with enhanced visibility up and down supply chains and into their data. This includes purchase orders, BOMs, Bills of Information (BOIs) and more data that is generated outside the organization from suppliers, contractors and partners.

Extending PLM into the supply chain and the departments that handle purchasing and logistics will not magically confer resiliency or make disruptions disappear, but it will provide organizations with more resources to handle them. Security experts will need to identify likely sources of disruptions, other worrisome vectors, and how peers have recovered. IT experts then need to code these findings into the PLM environment for reference. Top management must ensure that staff and funds are available to address these disruptions and to understand that resiliency will never be “done.” 

Additional benefits of moving the supply chain into the PLM strategy and environment include facilitating easier access for all users and fostering better collaboration within and between engineering, production, services, partners and suppliers.

What a PLM-Enabled Supply Chain Will Look Like

PLM-enabled visibility helps the organization reduce risks and minimize disruptions, making it feasible to connect BOMs and specifications with the broader supply chain and the entire enterprise—from production, development and engineering to operations, sales, service and so on. This connection must include a product’s initial concept and ideation through to the end of its useful life.

This visibility, made possible by PLM, will dispel misconceptions that supply chains are quasi-independent and stand-alone, or that purchasing is responsible only for inventory control and answers only to finance.

In addition, the overall enterprise and everyone in it gains visibility into managing and optimizing the product lifecycle. They also gain support to proactively assess and evaluate risk via alternate components, new suppliers and more. Users also gain decision support on how to act when disruptions occur.

As time passes, PLM extensions and additional implementations will play vital roles in building an enterprise’s supply chain resiliency and recovering from the disruptions everyone knows are ahead—it is just a matter of time. I cannot overstate the potential costs of not bringing the supply chain into PLM.

The post Supply chain resilience through PLM integration appeared first on Engineering.com.

The Role of Top Leadership Amid PLM and Data Management

Published on Engineering.com, Nov. 30, 2023 (https://www.engineering.com/the-role-of-top-leadership-amid-plm-and-data-management/): five tips to stop the surge of industry data and information.
People at the top of their enterprise have a growing, if sometimes unrecognized, responsibility to ensure their business’ data is managed properly. In the past, this management of information was often left to business units and even individuals. As such, leaders may not be up to speed on what they need to know.

(Image: Bigstock.)

For instance, leaders—and many articles targeting their demographic—use the phrase “data and information” as if they are synonymous. But they are not, and the distinction is quite significant. And, yes, I have been guilty of this error in syntax at times when I have not been careful.

The phrase “data and information” is misleadingly broad; the innocuous conjunction “and” may conceal more than it reveals. My view of data and information aligns with many in the data management world. In summary, data is a set of facts and information adds context to said facts. This distinction means that data is unorganized, while information brings order to that data. When placed together in context, information helps to map a bigger view of what that data means and what it is telling you.
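The distinction can be made concrete with a small, purely illustrative example (the measurement names, units and threshold below are hypothetical): bare readings are data; the same readings wrapped in context become information someone can act on.

```python
# Bare facts: unorganized and ambiguous on their own. This is data.
data = [47.2, 47.9, 51.3]

# Adding context turns the same facts into information.
information = {
    "measurement": "bearing temperature",
    "unit": "degrees C",
    "asset": "Pump 12",
    "alarm_threshold": 50.0,
    "readings": data,
}

# With context, the readings support a decision; strip the context
# away and they revert to mere numbers.
over_threshold = [r for r in information["readings"]
                  if r > information["alarm_threshold"]]
print(over_threshold)  # [51.3]
```

Lose the surrounding dictionary and nothing about `[47.2, 47.9, 51.3]` says what was measured, where, or why 51.3 matters—which is exactly the reversion from information back to data described above.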

Data and information can also be understood as alternating between these phases. All data has had a previous life as information for someone, somewhere; once its context was forgotten, it reverted to data. If this doesn’t underline the need to reinforce data management, I do not know what would.

Formats and processes change and evolve constantly, forcing data to become information in varied ways. These changes and evolutions gradually undermine the processes and practices we use to manage our data. With every passing day, they become less effective.

Usually, no one notices until a mishap comes to light. Perhaps a decision or analysis based on data slipped under management’s radar, making it unreliable or a liability in disguise. Steering clear of potential disasters is a prime responsibility of top management. This article extends that responsibility to ensuring sustainable and effective management of data, as well as the role PLM can take in support of this critical enterprise task.

The Data Challenges Ahead for Top Management

Leaders at the top of their organizations must deal with some unsettling realities, including:

  • The numerous operating systems that choke task- and process-oriented data repositories.
  • The various formats, task-linked processes and apps that convert data into information that is unusable by other tools.
  • The amount of information that can become redundant, obsolete, or trivial.
  • The large percentage of data that is stored but never looked at or processed.
  • The vast majority of business information that is unstructured and difficult to search, and therefore difficult to leverage.
  • The explosion of data and information, with zettabytes stored in the Cloud.
  • Artificial intelligence (AI) adding to the explosion in enterprise data.

These facts warn us that data sitting in a repository (on-premises or in the Cloud) is not necessarily data under effective management. That effective data management is slipping away makes a compelling case for enterprise leadership’s responsibility to reinforce sound data-handling practices.

A Digital Tool Can’t Solve Everything, Including Data Management

The reinforcement of data management extends beyond digital tools. Effective data access, use and reuse is a management responsibility, requiring solution providers, consultants and internal IT management to be held accountable for users’ ongoing concerns. This means developing a hard-nosed focus on failures to address recurring data problems and identify, improve and/or remove any software or process shortfalls—regardless of the guilty party.

It also means recognizing that information management, for easy collaboration and comprehension, underlies everything the enterprise accomplishes. So that nothing will be overlooked, I recommend that reinforcement take a structured approach by applying CIMdata’s “five V’s” to address data surges in enterprise data repositories. They are:

  1. Volume of data should be addressed by reducing or constraining the highest inflows from engineering, production systems, smart devices and AI.
  2. Variety of data can be sharply reduced by focusing on information that is actually needed instead of what might someday be needed.
  3. Velocity, defined as the rate at which data accumulates, can be slowed by blocking anything too peripheral to keep—like noisy data.
  4. Veracity reinforces data management by continually re-establishing the value, accuracy and completeness of all inbound data.
  5. Verification creates mandatory periodic check-ups on the preceding four V’s.
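The first four V’s can be operationalized as intake checks on inbound records, with verification being the periodic re-run of those same checks over stored data. The sketch below is a minimal illustration of that idea; the field names, approved sources and thresholds are assumptions for the example, not CIMdata-prescribed values.

```python
# Illustrative gating rules implementing the five V's at data intake.
REQUIRED_FIELDS = {"source", "timestamp", "payload"}  # veracity: completeness
APPROVED_SOURCES = {"engineering", "production", "smart_devices"}  # variety
MAX_DAILY_RECORDS = 100_000  # volume/velocity: cap the rate of accumulation

def admit(record, daily_count):
    """Return (admitted, reason); reject anything failing a V check."""
    if daily_count >= MAX_DAILY_RECORDS:
        return False, "velocity: daily intake cap reached"
    if record.get("source") not in APPROVED_SOURCES:
        return False, "variety: unapproved source"
    if not REQUIRED_FIELDS <= record.keys():
        return False, "veracity: incomplete record"
    return True, "admitted"

# Verification (the fifth V) is simply re-running these rules on what is
# already stored and flagging anything that no longer passes.
print(admit({"source": "engineering", "timestamp": 1, "payload": "spec"}, 0))
```

Encoding the V’s as executable rules also gives the governance program something auditable: every rejection carries the reason it failed, which feeds directly into the accountability structures described next.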

If these five V’s sound like data governance, this is no coincidence: they are designed specifically to organize and implement the policies, procedures, structures, roles and responsibilities that outline and enforce rules of engagement, decision rights and accountabilities.

The Role of PLM and Change Management in Data Management

For reinforcing effective management of data, an end-to-end PLM approach is unmatched. Its connections to digital twins, digital threads and end-to-end lifecycle connectivity point to strategies and tactics top management can use as they prod the enterprise’s many business units to establish effective data management.

Reinforcing data management, with or without enabling an appropriate PLM approach, presupposes a determined commitment at the executive level to remedy past abuses, as well as current shortcomings in the everyday collaborations that require data to be readily accessible, handled, used and modified for others to use.

Other top-of-enterprise solutions fall short compared to well-implemented PLM environments in significant ways. Enterprise resource planning (ERP) focuses on bills of materials (BOMs) and the costs and revenues impacting ROI. Manufacturing execution systems (MES) are at the core of well-run production operations but manage only that information. Product data management (PDM) lives on as PLM’s predecessor in small- and medium-sized organizations. Customer relationship management (CRM) focuses on pinpointing customer needs.

Effective data governance means overseeing the implementations of all new solutions to ensure that capabilities work as promised. It is at the heart of reinforcing sound management of all forms of data. It is so important that CIMdata added a data governance practice to its strategic consulting offerings more than five years ago.

A key tool for reinforcing the effective management of data is configuration management (CM). Usually implemented within data governance, CM ensures that an enterprise’s products, processes, facilities, services, networks, assets and IT systems are what they are intended to be and properly optimized for their intended use.

CM tracks all forms of data and clarifies changes and modifications to it, even if unintended or undetected, throughout the product lifecycle. A comprehensive CM approach also pinpoints and fixes problems, avoiding surprises such as hidden errors and unexpected outages.
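Detecting changes "even if unintended or undetected" is, at its core, a baselining problem: capture a fingerprint of each configuration item when it is approved, then compare current state against the baseline. The sketch below illustrates that mechanism with content hashes; the file names and contents are hypothetical, and a real CM system would of course track far richer metadata.

```python
import hashlib

def fingerprint(item: bytes) -> str:
    """Content hash used as the configuration baseline identity."""
    return hashlib.sha256(item).hexdigest()

# Hypothetical baseline captured when the configuration was approved.
baseline = {
    "pump_spec.step": fingerprint(b"rev C geometry"),
    "controller.cfg": fingerprint(b"setpoint=72"),
}

def detect_drift(current):
    """Return configuration items that changed, appeared or disappeared."""
    drifted = []
    for name in baseline.keys() | current.keys():
        if name not in current or name not in baseline:
            drifted.append(name)  # item added or removed since approval
        elif fingerprint(current[name]) != baseline[name]:
            drifted.append(name)  # item silently modified
    return sorted(drifted)

# An undetected edit to controller.cfg surfaces immediately:
print(detect_drift({
    "pump_spec.step": b"rev C geometry",
    "controller.cfg": b"setpoint=85",   # unintended change
}))  # ['controller.cfg']
```

Run periodically—much like the verification step of the five V’s—such a comparison turns hidden errors into reported discrepancies before they become unexpected outages.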

Effective data management calls for tools to have continuous access to data regardless of changes and formats, with proper security and syntax extensions to untangle conflicts, omissions and errors, as well as formats that have become obsolete.

The Role of Top Management in Data Management

Establishing and reinforcing effective information management is not specific to information technology, engineering technology or even operational technology. It is instead tied to the executive suite’s concerns about what the enterprise can achieve, at what cost, with what products and services, and at what risk. This is why I insist that reinforcement strategies and tactics should parallel and enable business models.

With all current data challenges, only naive managers would think that their data management tools, systems and policies—even the latest and greatest—will continue to function adequately. This is why top management’s reinforcement role is primarily about people. New policies, procedures and plans must be thought through, implemented and monitored.

Most important, of course, is getting key people to see their responsibilities in new ways. Everything digital in the enterprise is already being impacted. Top management should plan accordingly and brace subordinates for unexpected and even unforeseeable developments.

To sum up, reinforcing the sound management of data is primarily about how people perceive and do their jobs, rather than the digital tools and systems they use. Implementing these reinforcement practices puts a premium on persuasion, which should be a key skill of any high-level manager.

The post The Role of Top Leadership Amid PLM and Data Management appeared first on Engineering.com.
