Is Nvidia’s Jetson Thor the robot brain we’ve been waiting for? https://www.engineering.com/is-nvidias-jetson-thor-the-robot-brain-weve-been-waiting-for/ Wed, 03 Sep 2025 15:39:58 +0000


Last month, Nvidia launched its powerful new AI and robotics developer kit, the Nvidia Jetson AGX Thor. The chipmaker says it delivers supercomputer-level AI performance in a compact, power-efficient module that enables robots and machines to run advanced “physical AI” tasks—like perception, decision-making, and control—in real time, directly on the device without relying on the cloud.

It’s powered by the full-stack Nvidia Jetson software platform, which supports any popular AI framework and generative AI model. It is also fully compatible with Nvidia’s software stack from cloud to edge, including Nvidia Isaac for robotics simulation and development, Nvidia Metropolis for vision AI and Holoscan for real-time sensor processing.

Nvidia says it’s a big deal because it solves one of the most significant challenges in robotics: running multi-AI workflows to enable robots to have real-time, intelligent interactions with people and the physical world. Jetson Thor unlocks real-time inference, critical for highly performant physical AI applications spanning humanoid robotics, agriculture and surgical assistance.

Jetson AGX Thor delivers up to 2,070 FP4 TFLOPS of AI compute, includes 128 GB memory, and runs within a 40–130 W power envelope. Built on the Blackwell GPU architecture, the Jetson Thor incorporates 2,560 CUDA cores and 96 fifth-gen Tensor Cores, enabled with technologies like Multi-Instance GPU. The system includes a 14-core Arm Neoverse-V3AE CPU (1 MB L2 cache per core, 16 MB shared L3 cache), paired with 128 GB LPDDR5X memory offering ~273 GB/s bandwidth.
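To make the idea of on-device inference concrete, here is a minimal sketch of GPU-accelerated inference with PyTorch. The model, input shape and file-free dummy frame are illustrative assumptions rather than Jetson-specific code; any CUDA-capable device would run it the same way.

```python
# A minimal sketch of on-device, GPU-accelerated inference with PyTorch.
# The model choice and the simulated camera frame are illustrative assumptions,
# not Jetson-specific APIs.
import torch
import torchvision.models as models

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Load a pretrained vision model and move it to the device
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).to(device).eval()

# Simulated camera frame: one 224x224 RGB image
frame = torch.rand(1, 3, 224, 224, device=device)

with torch.no_grad():
    logits = model(frame)  # forward pass runs entirely on the device
    print("Predicted class index:", logits.argmax(dim=1).item())
```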

There’s a lot of hype around this particular piece of kit, but Jetson Thor isn’t the only game in town. Other players like Intel’s Habana Gaudi, Qualcomm RB5 platform, or AMD/Xilinx adaptive SoCs also target edge AI, robotics, and autonomous systems.

Here’s a comparison of what’s available currently and where it shines:

Edge AI robotics platform shootout

Nvidia Jetson AGX Thor

Specs & Strengths: Built on the Nvidia Blackwell GPU architecture, it delivers up to 2,070 FP4 TFLOPS and includes 128 GB of LPDDR5X memory—all within a 130 W envelope. That’s a 7.5 times leap in AI compute and 3 times better efficiency compared to the previous Jetson Orin line. It is equipped with 2,560 CUDA cores, 96 Tensor cores and a 14-core Arm Neoverse CPU, and features 1 TB of onboard NVMe storage, robust I/O including 100 GbE, and optimization for real-time robotics workloads with support for LLMs and generative physical AI.

Use Cases & Reception: Early pilots and evaluations are taking place at several companies, including Amazon Robotics, Boston Dynamics, Meta and Caterpillar, along with pilots at John Deere and OpenAI.

Qualcomm Robotics RB5 Platform

Specs & Strengths: Powered by the QRB5165 SoC, it combines an octa-core Kryo 585 CPU, an Adreno 650 GPU and a Hexagon Tensor Accelerator delivering 15 TOPS, along with multiple DSPs and an advanced Spectra 480 ISP capable of handling up to seven concurrent cameras and 8K video. Connectivity is a standout—integrated 5G, Wi-Fi 6, and Bluetooth 5.1 for remote, low-latency operations. It is built for security with a Secure Processing Unit, cryptographic support, secure boot, and FIPS certification.

Use Cases & Development Support: Ideal for use cases like SLAM, autonomy, and AI inferencing in robotics and drones. Supports Linux, Ubuntu, and ROS 2.0 with rich SDKs for vision, AI, and robotics development.

(Read more about the Qualcomm Robotics RB5 platform on The Robot Report)

AMD Adaptive SoCs and FPGA Accelerators

Key Capabilities: AMD’s AI Engine ML (AIE-ML) architecture provides significantly higher TOPS per watt by optimizing for INT8 and bfloat16 workloads.

Innovation Highlight: Academic projects like EdgeLLM showcase CPU–FPGA architectures (using AMD/Xilinx VCU128) outperforming GPUs in LLM tasks—achieving 1.7 times higher throughput and 7.4 times better energy efficiency than NVIDIA’s A100.

Drawbacks: Powerful but requires specialized development and lacks an integrated robotics platform and ecosystem.

The Intel Habana Gaudi is more common in data centers for training and is less prevalent in embedded robotics due to form factor limitations.

How digital transformation and remote monitoring drive sustainability in manufacturing https://www.engineering.com/how-digital-transformation-and-remote-monitoring-drive-sustainability-in-manufacturing/ Wed, 13 Aug 2025 17:55:34 +0000 Sustainability is no longer a peripheral concern; it’s a strategic and financial imperative.

Regulatory pressure, stakeholder expectations, and rising energy costs have made environmental stewardship critical to long-term success. For manufacturing engineers, this presents both a challenge and an opportunity: How can operations become more resource-efficient without compromising productivity?

The answer increasingly lies in digital transformation systems — specifically, in the deployment of remote monitoring technologies that turn real-time data into actionable sustainability improvements. From energy and water efficiency to predictive maintenance and emissions tracking, these technologies are reshaping how manufacturers optimize resource use and reduce their environmental footprint.

Indeed, the path to sustainable manufacturing runs through data — and remote monitoring is the bridge.

What is remote monitoring?

Remote monitoring involves using Internet of Things (IoT) sensors, embedded systems, and cloud platforms to continuously collect and analyze data from equipment, utilities, and environmental systems across a facility. This data is centralized through manufacturing execution systems (MES), enterprise resource planning (ERP) software, or dedicated building management systems (BMS).

Instead of relying on manual checks, logbooks, or periodic audits, engineers and facility managers get real-time visibility into performance metrics — allowing them to make faster, more informed decisions that directly impact sustainability.
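To illustrate how a single monitored reading makes its way from an asset to a central platform, here is a minimal sketch of a device posting power readings to a cloud ingest service. The endpoint URL, API key and sensor-read function are hypothetical placeholders, not a specific vendor’s API.

```python
# A minimal sketch of a monitored asset reporting to a cloud platform.
# The endpoint URL, API key and sensor-read function are hypothetical placeholders.
import time
import random
import requests

INGEST_URL = "https://monitoring.example.com/api/v1/readings"  # hypothetical endpoint
API_KEY = "REPLACE_ME"

def read_compressor_kw() -> float:
    """Stand-in for reading a power meter over Modbus or another fieldbus."""
    return 55.0 + random.uniform(-2.0, 2.0)

while True:
    reading = {
        "site": "plant-1",
        "asset_id": "compressor-03",
        "metric": "power_kw",
        "value": read_compressor_kw(),
        "ts": time.time(),
    }
    # Push the reading to the central platform; a gateway would batch and retry in practice
    requests.post(INGEST_URL, json=reading,
                  headers={"Authorization": f"Bearer {API_KEY}"}, timeout=5)
    time.sleep(60)  # one reading per minute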

Energy efficiency through real-time monitoring

Energy use is one of the biggest drivers of cost and carbon emissions in manufacturing. Remote monitoring enables a granular view of energy consumption across assets and zones, revealing exactly where, when, and how energy is being used or wasted.

Smart meters and sub-meters connected to a centralized dashboard can identify all sorts of conditions on the shop floor and beyond, including:

  • Idle equipment that’s consuming power during off-hours
  • HVAC systems operating outside of optimal temperature ranges
  • Lighting systems left on in unoccupied zones
  • Peak load times where demand charges can be minimized

By linking this data with control systems, manufacturers can automate load balancing, schedule equipment operations, and even initiate demand-response actions in coordination with utility providers. This reduces both energy costs and greenhouse gas emissions.
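For example, a simple analysis of sub-meter data can surface equipment that keeps drawing power during off-hours. The following pandas sketch assumes a hypothetical CSV export of 15-minute readings and an illustrative 2 kW threshold.

```python
# A sketch of flagging equipment that keeps drawing power during off-hours.
# The CSV layout, 15-minute intervals and 2 kW threshold are illustrative assumptions.
import pandas as pd

meter_df = pd.read_csv("submeter_readings.csv", parse_dates=["timestamp"])
# expected columns: timestamp, asset_id, kw

# Off-hours defined here as 22:00-06:00
off_hours = meter_df[(meter_df["timestamp"].dt.hour >= 22) |
                     (meter_df["timestamp"].dt.hour < 6)]

# Average off-hours draw per asset, keeping only assets above the threshold
idle_draw = (off_hours.groupby("asset_id")["kw"]
             .mean()
             .loc[lambda s: s > 2.0]
             .sort_values(ascending=False))

print("Assets drawing more than 2 kW on average during off-hours:")
print(idle_draw)
```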

Water conservation and waste reduction

Water plays a crucial role in many manufacturing processes — from cooling and cleaning to production itself. However, leaks, inefficiencies, and overuse are common and costly. Remote monitoring helps tackle this by using flow sensors, pressure gauges, and smart valves to track water use in real time. Cooling systems can be optimized to reduce unnecessary water cycling, and smart alerts can be triggered by unexpected consumption spikes that point to leaks or process failures. Usage trends can be analyzed to adjust cleaning cycles or reuse treated wastewater.
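A basic version of such a spike alert can be as simple as comparing each flow reading against a rolling baseline. This sketch assumes a hypothetical flow-meter export and an illustrative 30% threshold.

```python
# A sketch of a consumption-spike alert: compare each flow reading against a
# rolling one-hour baseline. The file name and the 30% threshold are assumptions.
import pandas as pd

flow_df = (pd.read_csv("line3_water_flow.csv", parse_dates=["timestamp"])
           .set_index("timestamp"))
# expected column: litres_per_min

baseline = flow_df["litres_per_min"].rolling("1h").median()
spikes = flow_df[flow_df["litres_per_min"] > 1.3 * baseline]

if not spikes.empty:
    print(f"{len(spikes)} readings exceeded the rolling baseline by more than 30%:")
    print(spikes.head())
```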

In plants with on-site wastewater treatment, remote monitoring can ensure compliance with discharge limits and optimize treatment operations, minimizing environmental impact while reducing chemical and energy usage.

Predictive maintenance and asset efficiency

We’ve covered this a lot in this series—for a reason. One of the most effective ways to reduce waste and energy consumption is to keep machinery operating at peak efficiency. With remote condition monitoring, engineers can track vibration, temperature, current draw, and operational hours of key equipment in real time.
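As a simple illustration, condition monitoring often reduces to computing a summary statistic from a sensor window and comparing it against an alarm level. In this sketch both the vibration data and the 7.1 mm/s limit are illustrative assumptions, not a standards-derived threshold.

```python
# A sketch of a simple condition-monitoring check: compute the RMS of a vibration
# window and compare it to an alarm level. The data and the 7.1 mm/s limit are
# illustrative assumptions.
import numpy as np

ALERT_MM_S = 7.1

def vibration_rms(samples_mm_s: np.ndarray) -> float:
    """Root-mean-square velocity of one window of raw vibration samples."""
    return float(np.sqrt(np.mean(np.square(samples_mm_s))))

window = np.random.normal(0.0, 2.5, size=4096)  # stand-in for one second of sensor data
rms = vibration_rms(window)

if rms > ALERT_MM_S:
    print(f"ALERT: vibration RMS {rms:.1f} mm/s exceeds {ALERT_MM_S} mm/s, schedule inspection")
else:
    print(f"Vibration RMS {rms:.1f} mm/s is within the normal range")
```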

Environmental monitoring and emissions tracking

Modern manufacturing operations are under pressure to reduce air emissions, particulate output, and volatile organic compounds (VOCs). Remote monitoring plays a vital role in tracking these metrics through ambient sensors, gas analyzers, and stack monitors connected to cloud systems.

These systems provide continuous emissions reporting for regulatory compliance and early warnings when emissions approach critical thresholds. They also maintain historical data used for environmental, social, and governance reporting.

This not only keeps operations within legal bounds but also supports a proactive approach to pollution prevention, enabling facilities to fine-tune combustion systems or ventilation processes based on real-time feedback.

Renewable energy integration

As more facilities adopt on-site renewable energy—be it solar, wind, or combined heat and power (CHP)—managing the variability and integration of these sources becomes essential. Remote monitoring allows for dynamic balancing of solar generation output against real-time load, battery storage availability, and grid draw during peak versus off-peak hours.

This maximizes the use of clean energy, reduces fossil fuel dependency, and lowers emissions associated with energy use. In some cases, surplus energy can be fed back into the grid or redirected to storage systems, enhancing sustainability while reducing operating costs.
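The balancing logic itself can start out very simple. The following sketch shows an illustrative dispatch rule that serves the load from solar first, then from the battery above a reserve level, then from the grid; the 20% reserve and the omission of battery power limits are simplifying assumptions.

```python
# A sketch of a very simplified dispatch rule: serve the load from solar first,
# then from the battery above a reserve level, then from the grid.
def dispatch(load_kw: float, solar_kw: float, battery_soc: float) -> dict:
    from_solar = min(load_kw, solar_kw)
    remaining = load_kw - from_solar

    from_battery = remaining if battery_soc > 0.2 else 0.0  # keep a 20% reserve
    remaining -= from_battery

    surplus_to_storage = max(solar_kw - load_kw, 0.0)       # charge storage with excess solar

    return {
        "from_solar_kw": from_solar,
        "from_battery_kw": from_battery,
        "from_grid_kw": remaining,
        "to_storage_kw": surplus_to_storage,
    }

print(dispatch(load_kw=480.0, solar_kw=350.0, battery_soc=0.65))
```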

Digital twins and process optimization

Beyond monitoring individual systems, the technology and processes involved in digitization and digitalization (which combine to form the basis of digital transformation) enable the creation and application of digital twins of production lines or facilities. By integrating real-time monitoring data into these simulations, engineers can model energy and resource usage under different production scenarios and test any process changes before implementing them physically. This can identify optimal settings for production that also reduce scrap, cycle time, or energy used per unit produced.

This capability is powerful for continuous improvement and sustainability planning, allowing facilities to adapt quickly to new customers or product mixes.

The engineering advantage

For manufacturing engineers, the integration of remote monitoring technologies into digital transformation strategies isn’t just a sustainability move — it’s a smarter way to run a business. These systems deliver granular, real-time insights that enable better decisions, faster response, and long-term efficiency.

As sustainability becomes more tightly linked to profitability, risk management, and brand reputation, engineers who understand and embrace these technologies will be best positioned to lead their organizations into a more resource-efficient and environmentally responsible future.

From chain-of-thought to agentic AI: the next inflection point https://www.engineering.com/from-chain-of-thought-to-agentic-ai-the-next-inflection-point/ Fri, 08 Aug 2025 15:11:08 +0000 AI that thinks versus AI that acts. Autonomously. Systemically. At scale.

We have learned to prompt AI. We have trained it to explain its reasoning. And we have begun to integrate it as a co-pilot or ‘co-assistant’ in science, product design, engineering, manufacturing and beyond—to facilitate enterprise-wide decision-making.

But even as chain-of-thought (CoT) prompting reshaped how we engage with machines, it also exposed a clear limitation: AI still waits for us to tell it what to do.

Often in engineering, the hardest part is not finding the right answer—it is knowing what the right question is in the first place. This highlights a critical truth: even advanced AI tools depend on human curiosity, perspective, and framing.

CoT helps bridge that gap, but it is still a people-centered evolution. As AI begins to reason more like humans, it raises a deeper question: Can it also begin to ask the right questions, not just answer them? Can the machine help engineers make product development or manufacturing decisions?

As complexity escalates and time-to-decision contracts, reactive monolithic enterprise systems alone will no longer suffice. We are entering a new era—where AI stops assisting and starts orchestrating.

Welcome to the age of agentic AI.

Chain-of-thought: transformational but not autonomous

CoT reasoning is a breakthrough in human-AI collaboration. By enabling AI to verbalize intermediate steps and reveal transparent reasoning, CoT has reshaped AI from an opaque black box into a more interpretable partner. This evolution has bolstered trust, enabling domain experts to validate AI outputs with greater confidence. Across sectors such as engineering, R&D, and supply chain management, CoT is accelerating adoption by enhancing human cognition.

Yet CoT remains fundamentally reactive. It requires human prompts and structured queries to function, lacking autonomy or initiative. In environments rife with complexity—thousands of interdependent variables influencing product development, manufacturing, and supply chains—waiting for human direction slows response and restricts scale.

Consider a product design review with multiple engineering teams navigating dynamic regulatory demands, supplier constraints, and shifting market trends. CoT can clarify reasoning or suggest alternatives, but it cannot autonomously prioritize design changes or coordinate cross-functional decisions in real time.

CoT is just the visible tip of the iceberg. While it connects to the underlying data plumbing, the real shift lies in how AI can interrogate these relationships meaningfully—and potentially uncover new ones. That is where things start to tip from reasoning to autonomy, and the door opens to agentic AI.

From logic to autonomous action

Agentic AI represents a fundamental leap from the prompt-response paradigm. These systems initiate, prioritize, and adapt. They fuse reasoning with goal-driven autonomy—capable of contextual assessment, navigating uncertainty, and taking independent action.

Self-directed, proactive, and context-aware, agentic AI embodies a new class of intelligent software—no longer answering queries alone but orchestrating workflows, resolving issues, and closing loops across complex value chains.

As Steven Bartlett noted in a recent episode of his podcast, The Diary of a CEO (DOAC): “AI agents are the most disruptive technology of our lifetime.” They will not just change how we work—they will change what it means to work, reshaping roles, decisions, and entire industries in their wake.

The 2025 Trends in Artificial Intelligence report from Bond Capital highlights this transition, describing autonomous agents as evolving beyond manual interfaces into core enablers of digital workflows. The speed and scope of this transformation evoke the early days of the internet—only this time, the implications promise to be even more profound.

Redefining the digital thread

Agentic AI rewires the digital thread—from passive connectivity to proactive intelligence across the product lifecycle. No longer static, the thread becomes adaptive and autonomous. Industry applications are wide-ranging:

  • In quality, AI monitors sensor streams, predicts anomalies, and triggers resolution—preventing defects before they occur.
  • In configuration management, agents detect part-software-supplier conflicts and self-initiate change coordination.
  • In supply chain orchestration, disruptions prompt real-time replanning, compliance updates, and automated documentation.

The result: reduced cycle times, faster iteration, and proactive risk mitigation. Can the digital thread become a thinking, learning, acting ecosystem—one that bridges data, context, and decisions?
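Under the hood, an agent of this kind is still an observe-decide-act loop. The following minimal sketch uses made-up readings, a made-up tolerance and a print statement in place of real MES, PLM or ERP integrations; it only illustrates the loop structure, not any vendor’s agent framework.

```python
# A minimal observe-decide-act loop of the kind a quality-monitoring agent runs.
# Readings, tolerance and the "corrective action" are made-up stand-ins.
import random
import time

def observe() -> float:
    return random.gauss(100.0, 3.0)  # e.g. a critical dimension in mm

def decide(reading: float, target: float = 100.0, tolerance: float = 5.0) -> str:
    return "correct" if abs(reading - target) > tolerance else "continue"

def act(action: str, reading: float) -> None:
    if action == "correct":
        print(f"Reading {reading:.1f} mm out of tolerance -> issuing corrective work order")

for _ in range(5):  # closed loop: observe, decide, act
    r = observe()
    act(decide(r), r)
    time.sleep(1)
```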

Nevertheless, the transformation is not just technical:

  • Trust and traceability: Autonomous decisions must be explainable, especially in regulated spaces.
  • Data readiness: Structured, accessible data is the backbone of agentic performance.
  • Integration: Agents must interface with PLM, ERP, digital twins, and legacy systems.
  • Leadership and workforce evolution: Engineers become orchestrators and interpreters. Leaders must foster new models of human-AI engagement.

This shift is from thinking better to acting faster, smarter, and more autonomously. Agentic AI will redraw the boundaries between systems, workflows, and organizational silos.

For those ready to lead, this is not just automation, it is acceleration. If digital transformation was a journey, this is the moment the wheels leave the ground.

Building trustworthy autonomy

The road ahead is not about AI replacing humans—but about shaping new hybrid ecosystems where software agents and people collaborate in real time.

  • We will see AI agents assigned persistent roles across product lifecycles—managing variants, orchestrating compliance, or continuously optimizing supply chains.
  • These agents will not just “assist” engineers. They will augment system performance, suggesting better configurations, reducing rework, and flagging design risks before they materialize.
  • Organizations will create AI observability frameworks—dashboards for tracking, auditing, and tuning the behavior of autonomous agents over time.

Going forward, we might not just review dashboards—we might be briefed by agents that curate insights, explain trade-offs, and propose resolutions. To succeed, the next wave of adoption will hinge on governance, skill development, and cultural readiness:

  • Governance that sets transparent bounds for agent behavior, and continuous purposeful adjustments.
  • Skills that blend domain expertise with human-AI fluency.
  • Cultures that treat agents not as black boxes—but as emerging teammates or human extensions.

Crucially, managing AI hallucination—where systems generate plausible but inaccurate outputs—alongside the rising entropy of increasingly complex autonomous interactions, will be essential to maintain trust, ensure auditable reasoning, and prevent system drift or unintended behaviors.

Ultimately, the goal is not to lose control—but to gain new control levers. Agentic AI will demand a rethink not just of tools—but of who decides, how, and when. The future is not man versus machine. It should be machine-empowered humanity—faster, more adaptive, and infinitely more scalable.

5 more project deliverables that drive digital transformation success https://www.engineering.com/5-more-project-deliverables-that-drive-digital-transformation-success/ Wed, 06 Aug 2025 14:04:08 +0000 Some significant project deliverables that are essential to every digital transformation project plan.

Digital transformation often changes the way organizations create value. Technology adoption and the growing reliance on digital systems significantly benefit engineers.

From predictive analytics, robotic process automation and artificial intelligence to software that enhances collaboration, organizations are leveraging digital technology to create, capture, and deliver value in new and innovative ways.

Well-defined digital transformation project deliverables lead to project success. Vaguely defined or missing project deliverables create a risk of missed expectations, delays and cost overruns.

Here are the significant project deliverables essential to every digital transformation project plan:

  1. Project charter
  2. Data analytics and visualization strategy
  3. Generative AI strategy
  4. Data conversion strategy
  5. Data profiling strategy
  6. Data integration strategy
  7. Data conversion testing strategy
  8. Data quality strategy
  9. Risk Assessment
  10. Change management plan
  11. Data conversion reports

These deliverables merit more attention for digital transformation projects than routine systems projects. We’ll examine deliverables six through eleven in this article. To read the first article, click here.

Data integration strategy

Organizations achieve most of the value of digital transformation from new data-based insights and previously impossible process efficiencies. The insights are revealed through sophisticated data analytics and visualization, which rely on data integration. The process efficiencies are achieved in the same way.

The data integration strategy deliverable describes how data from every data source will be integrated with data from other data sources. The primary integration strategies are:

  • Populate a lakehouse or a data warehouse with data from multiple data sources. This strategy persists all the integrated data. Implementation and operation are more expensive but deliver the most integration opportunities and the best query performance.
  • Keep data sources where they are and populate cross-reference tables for key values. This strategy persists only key values to create the illusion of integration. It’s appealing because it’s the easiest and cheapest to implement and operate. It sacrifices query performance.

The value of the data integration strategy lies in informing the software development effort for each of the likely tools, such as extract, transform, and load (ETL), data pipelines, and custom software.
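As a simple illustration of the first strategy, an ETL job extracts data from two source systems, integrates it on a shared key, and loads the result into a warehouse table. The file names, columns and SQLite target below are stand-ins for real source connections and a real lakehouse or warehouse.

```python
# A sketch of a small ETL step for the "populate a warehouse" strategy.
# File names, columns and the SQLite target are illustrative stand-ins.
import sqlite3
import pandas as pd

# Extract (CSV exports stand in for live ERP/MES connections)
orders = pd.read_csv("erp_orders.csv")      # order_id, part_no, qty
quality = pd.read_csv("mes_quality.csv")    # order_id, defect_count

# Transform: join on the shared key and derive a defect rate per order
merged = orders.merge(quality, on="order_id", how="left")
merged["defect_rate"] = merged["defect_count"].fillna(0) / merged["qty"]

# Load: persist the integrated table
with sqlite3.connect("warehouse.db") as conn:
    merged.to_sql("order_quality", conn, if_exists="replace", index=False)
```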

Data conversion testing strategy

It’s impossible for engineers to overestimate the effort required for data conversion testing. Every digital transformation project will involve some risk of data degradation as a result of the conversion process.

The data conversion testing strategy is a deliverable that describes who will test the data conversion and how. It considers the following:

  • Automation opportunities.
  • Statistical analysis techniques.
  • Mismatches in joins of keys and foreign keys.
  • Strategies for minimizing manual inspection.

The value of the data conversion testing strategy lies in describing the skills and estimating the effort needed to confirm the adequacy of the data conversion.
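For instance, key and foreign-key mismatches can be found automatically with an anti-join between converted tables. The table and column names in this sketch are hypothetical.

```python
# A sketch of one automated conversion check: an anti-join that finds converted
# work orders whose part_id has no match in the converted parts table.
# Table and column names are hypothetical.
import pandas as pd

parts = pd.read_csv("converted_parts.csv")              # part_id, description, ...
work_orders = pd.read_csv("converted_work_orders.csv")  # wo_id, part_id, ...

orphans = work_orders.merge(parts[["part_id"]], on="part_id",
                            how="left", indicator=True)
orphans = orphans[orphans["_merge"] == "left_only"]

print(f"{len(orphans)} of {len(work_orders)} work orders reference a missing part")
orphans.to_csv("orphaned_work_orders.csv", index=False)
```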

Data quality strategy

The current data quality in the data sources will almost always disappoint. That reality leads to a data quality strategy deliverable for digital transformation that describes the following:

  • Acceptable data quality level for at least the most critical data elements.
  • Data enhancement actions that will augment available data to more fully meet end-user expectations.
  • Data quality maintenance processes to maintain data quality for the various data sources.

The data quality strategy enables the organization to achieve a consensus on what constitutes sufficient data quality and which data elements are most critical.

Risk assessment

Digital transformation projects experience various risks. This deliverable reminds management that these projects are not simple or easy.

The specific risks that digital transformation projects experience include:

  • Risk of data issues, such as insufficient quality and complex integration. These issues will add cost and schedule to digital transformation projects.
  • The risk of viewing AI as a silver bullet. The article “How can engineers reduce AI model hallucinations?” explains how to alleviate this risk.
  • The risk of data analytics and visualization software complexity. This complexity adds to the cost of training and creates a risk of misleading results.
  • The risk of an overly ambitious scope overwhelming the team and budget. This unrealizable scope creates disappointment and reduces commitment to the project.

Identifying and assessing potential risks associated with the transformation will help the project team mitigate risks and minimize their impact if they become a reality. The article “How engineers can mitigate AI risks in digital transformation” discusses the most common ones.

Change management plan

Digital transformation projects almost always introduce significant changes to business processes. Successful projects define a comprehensive people change management plan that includes:

  • Stating the change management goal, approach and supporting objectives.
  • Engaging project sponsors and stakeholders.
  • Description of the current situation.
  • Description of change management strategies, such as training, change agents and end-user support.
  • Recommended change management roles and resources.
  • Description of the approximate timeline.

This deliverable, when conducted as planned, ensures end-users, such as engineers, adopt new processes and digital tools. Without that adoption, the organization will not realize the benefits of the contemplated project.

Data conversion reports

The data conversion test report contains the results of the data conversion design and testing work for each datastore. It includes:

  • The number of software modules developed to perform data conversion.
  • A summary of the number of rows that couldn’t be converted with a cause.
  • A summary of the data quality improvements required.

The final data conversion report contains the results of the data conversion work by datastore. It includes a summary of the:

  • Number of rows converted.
  • Data quality improvements that were made.
  • Data issues that were not addressed.

These deliverables often frustrate stakeholders because engineers invariably recommend more effort to:

  • Improve historical data quality.
  • Strengthen data quality processes to maintain future data quality.

Paying detailed attention to these digital transformation project deliverables will ensure the success of your project. This list of deliverables is a subset of the overall set of deliverables that all projects typically complete.

Data pipelines in manufacturing https://www.engineering.com/data-pipelines-in-manufacturing/ Fri, 04 Jul 2025 18:39:43 +0000 A beginner’s guide to the basics of data for manufacturers.

It’s no secret that manufacturing is quickly becoming a data-driven environment. With that, the ability to collect, move, and make sense of information from machines and systems is becoming a core skill for engineers. From tracking machine performance to automating quality checks, data is no longer just a byproduct—it’s a strategic asset. At the heart of this shift is the data pipeline: the invisible but essential infrastructure that moves information from where it’s generated to where it can be used.

If you’re a manufacturer only just opening your eyes to the world of industrial data, this article will walk you through what a data pipeline is, how it works, and why it matters.

What is a data pipeline?

A data pipeline is a series of steps or processes used to move data from one place (the source) to another (the destination), often with some form of processing or transformation along the way. Think of it as an automated conveyor belt for information, designed to reliably carry data from machines, sensors, or systems to dashboards, databases, or analytics platforms.

In a manufacturing setting, a data pipeline might start at a temperature sensor on a CNC machine, pass through an edge device or gateway, and end up in a cloud-based dashboard where a plant manager can monitor operations in real time.

To understand how a data pipeline works, it helps to break it down into its basic parts. Data starts at the source. In manufacturing, common sources include sensors, machines, control systems and human inputs. Each source generates raw data such as numbers, states, or measurements that provide insight into how the equipment or process is performing.

Once data is generated, it needs to be collected and moved. This is often done using industrial protocols. OPC UA is a common standard for industrial automation systems. MQTT is a messaging protocol often used for sending data from edge devices. Modbus or Ethernet/IP are industrial stalwarts used to communicate with legacy equipment.

At this stage, edge devices may act as the bridge between your OT (Operational Technology) equipment and your IT infrastructure. Most of the time, this raw data needs to be cleaned, formatted, or enriched before it’s useful. This processing could involve filtering out noise or irrelevant data, averaging values over time, tagging data with machine IDs, timestamps, or batch numbers and detecting anomalies or generating alerts. All of this can occur at the edge, in a local server, or in the cloud, depending on the application.

After processing, data is stored or delivered to its final destination. This could be dashboards for real-time monitoring or databases or data lakes for long-term analysis. Manufacturing Execution Systems (MES) or ERP platforms are a prime destination for almost all manufacturing data and machine learning models will use the data for predictive analytics. The key is that data ends up somewhere it can be digested and acted upon, whether by people or machines.
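Putting these pieces together, a minimal pipeline stage might subscribe to machine topics over MQTT, drop implausible readings, tag each value with a machine ID and timestamp, and store it locally. The broker address, topic layout and SQLite store below are illustrative assumptions, not a reference architecture.

```python
# A sketch of one pipeline stage: subscribe to machine topics over MQTT, drop
# implausible readings, tag each value, and store it locally.
# Broker address, topic layout and the SQLite store are illustrative assumptions.
import json
import sqlite3
from datetime import datetime, timezone

import paho.mqtt.client as mqtt

db = sqlite3.connect("machine_data.db")
db.execute("""CREATE TABLE IF NOT EXISTS readings
              (machine_id TEXT, metric TEXT, value REAL, ts TEXT)""")

def on_message(client, userdata, msg):
    raw = json.loads(msg.payload)
    value = raw.get("temperature_c")
    # basic cleaning: drop readings outside a plausible physical range
    if value is None or not (-20.0 <= value <= 200.0):
        return
    db.execute("INSERT INTO readings VALUES (?, ?, ?, ?)",
               (raw.get("machine_id", "unknown"), "temperature_c", value,
                datetime.now(timezone.utc).isoformat()))
    db.commit()

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.plant.local", 1883)   # hypothetical on-premises broker
client.subscribe("plant/+/+/temperature")    # all lines, all machines
client.loop_forever()
```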

Why data pipelines matter in manufacturing

Data and connectivity on the shop floor have been a reality for many years, but for most of that time the cost and complexity of the technology meant it was adopted only by the largest manufacturers. Advances in chip technology, AI and cloud connectivity mean even small manufacturers can now implement these powerful technologies. As competition, complexity, and customer expectations grow, so does the need for smaller manufacturers to invest in connected, data-driven operations.

There are key benefits of implementing data pipelines that help manufacturers see a quick return on investment, such as:

  • Real-time visibility into what’s happening on the shop floor, instead of finding out after a shift ends
  • Noticing warning signs of failure before breakdowns occur
  • Catching defects early using data from sensors and vision systems
  • Monitoring usage patterns and energy requirements
  • Tracking every part and process for compliance and recall readiness

Indeed, even a basic data pipeline can replace clipboard checklists and Excel spreadsheets with automated, actionable insights. You don’t need to employ an army of data scientists to implement much of the current technology—most of it is designed with manufacturers’ deployment needs in mind, and low-code options are growing rapidly.

Start small, think big

If you’re new to data pipelines, the key is to start small. Pick one machine, one sensor, or one metric that matters. Build a basic pipeline that helps you see something you couldn’t before—then grow from there.

As factories become smarter and more connected, manufacturing engineers who understand how to harness data will be at the forefront of process innovation, quality improvement, and operational efficiency. Next time you look at a machine, don’t just see it as a tool, see it as a source of insight waiting to be unlocked.

Dealing with legacy software during a digital overhaul https://www.engineering.com/dealing-with-legacy-software-during-a-digital-overhaul/ Tue, 10 Jun 2025 14:50:12 +0000 Columnist and manufacturing engineer Andrei Lucian Rosca explains how legacy software and systems are important pieces of the digital transformation puzzle.

The big dilemma everyone faces when overhauling digital platforms is what should the business do with legacy software? In this context, the word “legacy” represents outdated tools, software or hardware that are still being used by companies and are still vital for operations. Their age and outdated nature pose various problems, such as high maintenance costs, security vulnerabilities and integration issues, but they can be integral to the day-to-day operations of a company.

Throughout my career, I have had exposure to several types of legacy software in different companies and industries. Organizations deal with the idea of transforming to a digital platform in one of three ways: they view the legacy software as crucial and must be integrated (high resistance), they keep using it in parallel to a digital approach regardless of the high cost, or they transition completely to a digital system, which is surprisingly the least common approach.

During my time working for a global automotive company, I encountered a semi-collaborative approach to data sharing and working together. The main problem for engineering was working together in a private ecosystem. This was caused by several factors—different rules and regulations at each location, legacy software, and a legacy mentality driven by several acquisitions that were never fully integrated. Our north star became the migration to a digital platform that bridged the divide and got the locations to work together. This approach was ultimately successful, and members of the organization could easily work on their projects from any location.

Of course, issues appeared with the transition to a digital thread, including numbering schemes and streamlining or adapting processes to local needs. I learned that it is very easy to fall down the rabbit hole if you entertain every little detail. Instead, your main focus should always remain the agreed scope. During this transition, I had to quell a lot of debates on minor things that could have derailed the scope of our projects, and there were a lot of projects in the initial phase, covering our desired outcomes from moving to a complete digital thread, which software to migrate or discontinue, vendor selection and many others.

Indeed, it’s worth taking time at the beginning to design your solution as thoroughly as possible—it saves a lot of headaches down the road and, most importantly, saves money. The role of an engineer in this specific spot is to balance the budget against features. First and foremost, in this role you must bridge the divide between design and manufacturing; this was one of the first things I learned as I was cutting my teeth in my first engineering job. You can design a product or a solution as neatly as possible, but in the end you must produce it, and producing a product is a whole other beast than just drawing it on your computer. Understanding the design component and having at least a surface understanding of how the product is manufactured gave me enough credibility with the shop-floor people that I became the go-to person for the head of manufacturing to raise their topics with and work with to incorporate them into the implementation process.

One of the most important but frequently ignored topics is user acceptance. People who work with a specific piece of software are usually subject matter experts (SMEs) and know the software in detail. Because of this, it can be tough to gain buy-in, but they are your most important asset in a legacy-software-to-digital-thread transformation. They have a depth of knowledge that is critical to a successful migration or transition. Who knows the legacy software’s outputs? Who knows how the processes were designed? Who knows which person down the process needs to be informed? The subject matter expert will make your life a thousandfold easier, so include them as early as possible, align on scope and have them help you build it.

If I were to choose one thing to avoid at all costs during a digital transformation, it would be ignoring parts of the organization. My success in this project was a result of the frequent consultation with the people handling day-to-day business of the organization. Since we started with several locations during ramp up, we ended up working very closely with people from all over the production process. This resulted in rapid feedback on anything that we did—especially on what we did wrong. That feedback is crucial, as we could incorporate it and adapt from sprint to sprint.

Legacy software is still present in many companies, but it should not be seen as a malign piece of a process that will kill a project before it starts. Rather, it was an important piece of the puzzle that fit the organization at a specific time in its existence, and as organizations mature and digital becomes the new norm, legacy software should be treated as an important aspect of a migration scenario, even if it will ultimately be replaced.

Andrei Lucian Rosca is an engineer with a bachelor’s in mechanical engineering focusing on CAD software and more than 10 years of experience in digital transformation projects in several industries, from automotive to consumer goods. He is currently exploring innovative solutions (e.g. IoT, AI) and how to include them in future projects.

How digital transformation augments smart building technology https://www.engineering.com/how-digital-transformation-augments-smart-building-technology/ Mon, 28 Apr 2025 14:44:17 +0000 Understanding and leveraging these integrations is a crucial hack for building resilient, future-ready operations.

As manufacturing facilities continue to adopt sustainable practices, digital transformation (DX) technologies are becoming indispensable tools for improving operational efficiency and environmental stewardship.

Among the least talked about integrations are DX systems combined with smart building infrastructure, such as lighting, heating, and cooling systems. These integrations enable real-time decision-making and intelligent energy management, which are crucial for achieving sustainability goals while keeping costs in check.

The role of digital transformation in smart building technology

Digital transformation encompasses a broad suite of technologies, including Internet of Things (IoT), artificial intelligence (AI), machine learning (ML), and cloud computing. When applied to building infrastructure systems—specifically lighting, heating, ventilation, and air conditioning (HVAC)—these technologies enable facilities to move from reactive to proactive and even autonomous operational models.

For example, IoT sensors embedded in lighting and HVAC systems collect real-time data on occupancy, ambient light, temperature, and air quality. DX platforms aggregate and analyze this data to optimize environmental conditions and energy usage dynamically. The result is a highly responsive facility that adjusts its energy usage based on actual need, not static schedules.

Smart lighting—beyond energy efficiency

Smart lighting systems are often the entry point for many manufacturers beginning their sustainability journey. However, these systems do more than just save energy through LED technology and motion sensors. When integrated with DX platforms, smart lighting systems can adjust brightness and color temperature based on the day’s task requirements. These systems integrate with occupancy and workflow data to provide optimal lighting only where and when needed. Lastly, the data collected by these systems will offer insights into space utilization, contributing to more efficient layout planning and capacity management.

When tied into a broader digital transformation ecosystem, lighting data can also add context to safety and productivity metrics, enabling data-driven improvements to workplace conditions.

Intelligent climate control

Smart HVAC systems aren’t new and have become a vital tool for many manufacturers. In environments where temperature and humidity control are critical—such as in pharmaceuticals, electronics, or food manufacturing—DX integration provides an edge.

AI-driven HVAC systems use predictive algorithms and the production schedule to deal with changes in external weather conditions, internal heat generation, equipment usage and product mix. These systems can:

  • Adjust airflow and temperature in zones based on real-time occupancy and process demands
  • Schedule maintenance based on predictive analytics, reducing downtime and energy waste
  • Learn from historical data to optimize performance across seasons and production patterns

Data integration and visualization

One of the key advantages of digital transformation is its ability to unify data streams from disparate systems into a single platform. Manufacturing engineers can visualize lighting, HVAC, production, and utility data on centralized dashboards. This helps identify energy-intensive areas and inefficiencies while correlating environmental conditions with production metrics, creating and tracking more accurate sustainability KPIs in real time. With advanced analytics, facilities can simulate different energy scenarios, forecast future consumption based on orders, and model ROI on proposed sustainability investments.

Interoperability and open standards

For maximum impact, DX systems should be built on open standards that support interoperability among devices and platforms. Manufacturing facilities often use equipment and systems from multiple vendors. Ensuring that lighting, HVAC, and digital solutions can “talk” to each other minimizes integration challenges and futureproofs the infrastructure. Middleware and APIs are increasingly being used to bridge communication gaps, layering advanced controls and analytics without replacing existing systems.

Sustainability and compliance

Sustainability in manufacturing has become a competitive imperative, with many governments and industry bodies enforcing stricter energy efficiency and emissions standards. Digital transformation systems provide the transparency and traceability needed to demonstrate compliance with regulations such as ISO 50001 (energy management) and ASHRAE standards.

AI, edge computing, and autonomy

The next frontier in DX-enabled smart environments is edge computing and autonomous control. Instead of sending all data to the cloud, edge devices can process information locally, enabling faster decision-making and reducing latency.

For example, a local controller might detect a sudden drop in occupancy and immediately dim lights and reduce HVAC output in that zone—without waiting for a central system to tell it what to do. This distributed intelligence model enhances responsiveness and resilience, especially important in large or multi-site manufacturing operations.
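The rule running on such an edge controller can be trivially small. This sketch encodes the occupancy-based setback described above, with the dim level and setpoint offset as illustrative assumptions.

```python
# A sketch of the local occupancy rule described above.
# The dim level and setpoint offset are illustrative assumptions.
def adjust_zone(occupancy_count: int, normal_setpoint_c: float = 21.0) -> tuple[int, float]:
    """Return (lighting level %, HVAC setpoint in C) for one zone."""
    if occupancy_count == 0:
        return 10, normal_setpoint_c + 2.0   # dim to a safety level, relax the setpoint
    return 100, normal_setpoint_c            # restore normal conditions

print(adjust_zone(occupancy_count=0))   # -> (10, 23.0)
```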

There are many reasons to invest in smart building infrastructure, but ultimately companies opt to install smart lighting, heating, and cooling technologies to reduce costs. Integrating these systems into a digital transformation regime can transform passive infrastructure into intelligent assets that contribute directly to efficiency, comfort, compliance, and cost reduction. For manufacturing engineers, understanding and leveraging these integrations is critical not only for meeting today’s sustainability standards but also for building resilient, future-ready operations.

How digital transformation boosts sustainability in manufacturing https://www.engineering.com/how-digital-transformation-boosts-sustainability-in-manufacturing/ Fri, 28 Mar 2025 18:05:56 +0000 Here are a few key ways digital transformation drives environmental and operational benefits.


By now, you probably already know that digital transformation is a manufacturing strategy that integrates advanced digital technologies to enhance efficiency, reduce waste and optimize resource use. But one angle that’s not always talked about is that the right application of these digital technologies can significantly improve sustainability.

Below is an examination of key areas where digital transformation drives environmental and operational benefits. It includes foundational steps for beginners and a few advanced techniques for more experienced engineers further down the digital transformation road.

Data-driven decision-making for sustainable operations         

At the foundation of digital transformation is data collection and analytics, which enable real-time tracking of key sustainability metrics such as energy consumption, material waste and emissions. IoT (Internet of Things) sensors, SCADA (Supervisory Control and Data Acquisition) systems and AI-driven analytics help manufacturers make informed decisions that optimize production efficiency while minimizing waste.

Beginners can start with IoT-enabled sensors to monitor machine performance, energy usage and material waste. From there, use basic dashboards to visualize trends and identify inefficiencies. The next step is to implement predictive maintenance using these insights to reduce unexpected breakdowns and extend machine lifespan.

Advanced users may have already deployed digital twins to create a virtual model of production systems, allowing their engineers to test optimizations before making real-world changes. AI-powered anomaly detection can automatically adjust machine parameters and reduce energy waste while integrating machine learning (ML) algorithms to analyze historical data to improve production scheduling and minimize resource-intensive downtime.
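Anomaly detection does not have to start with deep learning; a simple statistical baseline is often enough before moving to ML models. This sketch flags hourly energy readings that deviate more than three standard deviations from the mean, using simulated data and an illustrative threshold.

```python
# A sketch of a simple statistical baseline for anomaly detection on energy data.
# The simulated readings and the three-sigma threshold are illustrative assumptions.
import numpy as np

def zscore_anomalies(values: np.ndarray, threshold: float = 3.0) -> np.ndarray:
    """Indices of readings more than `threshold` standard deviations from the mean."""
    z = (values - values.mean()) / values.std()
    return np.where(np.abs(z) > threshold)[0]

energy = np.random.normal(120.0, 5.0, size=168)  # one week of hourly kWh readings
energy[42] = 220.0                               # injected spike

print("Anomalous hours:", zscore_anomalies(energy))
```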

Energy efficiency and carbon footprint reduction

Manufacturing facilities are obviously energy-intensive, but energy management systems can significantly lower power consumption and carbon emissions without disrupting production.

For beginners, smart meters and IoT sensors can track energy consumption at different production stages. Once you have this data, implement automated power-down schedules for non-essential equipment during off-peak hours.

More advanced users can integrate AI-driven load balancing to redistribute energy usage across equipment dynamically. They may decide to explore microgrid solutions that combine renewable energy sources (solar, wind) with energy storage for more sustainable operations. Carbon footprint tracking software will make it easier to comply with environmental, social and governance (ESG) standards and improve sustainability reporting.

Optimization for sustainable sourcing and logistics

A sustainable supply chain reduces emissions, optimizes material use and ensures responsible sourcing throughout a manufacturer’s network. Digital tools help companies improve inventory management, optimize transport routes and reduce overproduction.

If you haven’t already, implement cloud-based inventory management systems to track raw materials, reducing excess stock and waste. PLM and ERP software are the gold standard for this, but smaller manufacturers may not need all of the functionality these platforms provide and might decide to piece together the functionality they want using smaller software platforms that require less investment and cause less disruption during start-up. The goal is to gather enough data to use demand forecasting to avoid overproduction and prevent obsolete inventory or costly overstock. Next, implement a supply chain platform to ensure ethical sourcing and reduce supplier-related inefficiencies.

Advanced users are likely at least considering AI-driven dynamic routing systems for delivery fleets, optimizing transportation routes to reduce fuel consumption. RFID and GPS tracking will monitor product movement and optimize storage conditions, reducing spoilage. Next, establish closed-loop supply chains, where returned or defective materials are reintegrated into production rather than wasted.

Waste reduction and circularity

Everyone knows minimizing waste is critical for sustainable manufacturing. Advanced digital tools help manufacturers keep a handle on waste by tracking, sorting and helping repurpose materials efficiently.

Start with robust defect detection to reduce waste caused by faulty production runs. Introducing 3D printing (additive manufacturing) to minimize material waste and create precise, on-demand parts could make sense for a growing number of manufacturers. A basic data-fuelled recycling program for metal, plastic and other byproducts can keep waste under control.

For advanced users, AI-powered sorting systems that automatically separate and classify waste materials can improve the results of any recycling program. Digital product lifecycle tracking will accommodate customer product returns for disassembly and reuse, potentially taking the edge off raw material costs, as digital and smart advanced remanufacturing strategies help refurbish returned components and reintroduce them into production lines.

Smart manufacturing for sustainable production

Industry 4.0 technologies like automation, robotics, cloud computing and AR (augmented reality) can significantly reduce resource waste and improve efficiency in manufacturing environments.

Beginners can start by implementing basic robotics for repetitive tasks to improve precision and reduce material waste. Cloud-based collaboration tools will reduce paperwork and streamline production planning. Adopting AR-based training modules allows employees to learn new skills without exhausting physical materials.

Advanced users might look to deploy AI-powered collaborative robots (cobots) to enhance precision manufacturing and minimize errors, all while collecting valuable data. Edge computing from devices on the line analyzes machine data locally (rather than in the cloud). This reduces energy consumption for data processing and gives the impetus to implement real-time digital simulation models that predict potential disruptions and adjust production accordingly.

Key takeaways for manufacturing engineers

For those just starting: begin by implementing IoT sensors, analytics and basic automation to monitor and improve sustainability.

For experienced engineers: use advanced AI, blockchain and digital twins to optimize energy, supply chains and circularity.

Whether you are just starting your digital journey or are an advanced user of the latest digital technologies, it’s important to understand that efficiency is just one piece of the payback delivered by digital transformation—it’s also about future-proofing operations, which includes reducing environmental impact.

Industry 4.0 gets a curriculum developed by ASME and Autodesk https://www.engineering.com/industry-4-0-gets-a-curriculum-developed-by-asme-and-autodesk/ Fri, 07 Mar 2025 20:05:53 +0000 Six free courses benefit educators, engineering students and engineers.

Educators and engineering firms looking for training on Industry 4.0 have a new resource: a six-course curriculum for smart manufacturing created by the American Society of Mechanical Engineers (ASME) and Autodesk, Inc. The two organizations collaborated through 2023 and 2024 to compile interviews, analyze data and come up with real-world examples for the set of six free online courses. The lessons cover evolving engineering skills, including artificial intelligence (AI) and robotics, design for sustainability, Industry 4.0, data skills and business and digital literacy.

“We began this effort based on feedback from educators. Industry 4.0 is here and companies are struggling because they’re not workforce ready. Much of the knowledge being taught in the classroom is not geared to digital strategies,” says Pooja Thakkar Singh, program manager for the American Society of Mechanical Engineers. The courses are meant for mechanical engineers, manufacturing engineers and Computer Numerical Control (CNC) machinists. The sixth course contains examples of R&D work that show how to apply acquired knowledge to projects using Autodesk’s Fusion software. The overall goal of the curriculum is to empower educators and equip students and professionals with in-demand skills that will advance their careers and support modern manufacturing.

“The courses run between 30 and 45 minutes, with the underlay of the Fusion exercises being a little bit deeper. The software is very easy to use. This is advanced manufacturing,” says Debra Pothier, senior manager for Autodesk for strategy for architecture, engineering, construction and operations (AECO) and a partnership owner for ASME.

For example, the design for sustainability course covers how value chains and supply chains impact the environment and product lifecycles influence design solutions. It also explores the importance of the “Triple Bottom Line,” a framework that measures social, environmental and financial benefit.

A case study for the course, Evolving engineering skills, including AI, robotics and more with ASME. (Image: ASME and Autodesk)

“There were requests to make the courses as customizable as possible due to the changing landscape of Industry 4.0. We accomplished this by integrating PowerPoint slide decks and videos that faculty and instructors can switch out to keep up with the latest information,” says Singh.

Another way to customize the courses is to use Doodly, an animation software. The program can create dynamic virtual board drawings for videos. 

“Then you can remake the courses by re-recording and re-downloading content as many times as you need,” says Singh.

Critical ingredients for the courses

One of the key components of the courses is explanations of digital manufacturing skills that apply to mechanical engineering, manufacturing engineering and CNC machining, like CAM, 2.5- and 3-axis milling and simulation.

“Doing this impactfully involved getting all the stakeholders together in one virtual classroom. We had to ask ourselves what students were looking for and what they were passionate about. We also had to cover the challenges that industry experts were facing. We didn’t know those until we heard that from the sources,” says Singh.

Another key component is a simultaneous focus on hard skills, such as data analysis, and soft skills, like collaboration.

“We demonstrate how Fusion software facilitates the communication of generative design AI outputs, bridging the gap between technical skill and practical application,” says Curt Chan, strategic partnerships manager for Autodesk.

A third key component is explanations of the enormous impact of AI and how this tool affects the design process. In some situations, generative AI software can handle 90 percent of the programming required for machine part development. A mechanical engineer can then utilize their expertise to fine-tune the last 10 percent.

AI is like “a whole new toolbelt in a number of ways,” says Jason Love, technology communications manager for Autodesk.

Before AI was widely used, a mechanical engineer designing an assembly might have been required to learn how to draw a diagram of its parts.

“Now there are tools in place that with the click of a mouse, create those 2D diagrams from your 3D models. It falls to the human engineer to double check the accuracy of those drawings,” says Love.

Some educators may be unfamiliar with such changes or the new workflow itself. The courses address these problems by ensuring viewers grasp how many options AI creates.

“Faculty are going to have to teach the entire process, not their little silo,” says Pothier.

Introduction for the course: Evolving engineering skills, including AI, robotics and more with ASME. (Image: ASME and Autodesk)

How and why the curriculum works

The six courses are not tightly tied to one another, which gives an educator flexibility to “plug and play.” A manager could assign a course when an employee has downtime, or an educator could offer one for extra credit.

“Through conversations with educators, I’ve observed many different ways they are planning and implementing the curriculum,” says Chan.

The courses follow a loose sequential order. For example, the course that serves as an introduction defines the term “Industry 4.0,” explains the driving forces behind production processes and reviews the challenges carried over from Industry 3.0. A later course on digital literacy and data skills gives participants an understanding of Industry 4.0 technologies and data measurements, including the role of big data and how numerical insights drive manufacturing processes.

Each course has a self-assessment that learners can complete to earn a certificate. Participants can earn credit or mark their skills as upgraded after completing certain courses or the entire set.

One factor contributing to the popularity of the courses is the shift over the past five years toward online education, both synchronous and asynchronous, driven in part by the COVID-19 pandemic. Students and engineers have also become better versed in, and more motivated to apply, the knowledge they draw from online content.

ASME and Autodesk’s history of partnership

The Industry 4.0 curriculum is the latest result of ASME’s and Autodesk’s history of teamwork. The two entities have been working together since 2021. That year, Autodesk Foundation, Autodesk’s philanthropic arm, began donating funds to ASME’s Engineering for Change (E4C) research fellowship program.

As of late February 2025, the Autodesk Foundation has funded over 100 E4C fellowships to support nonprofits and startups in a range of fields. These include energy and materials development, health and resilience systems, and work and prosperity opportunities. The donations to E4C have also expanded the reach and impact of Autodesk Foundation’s Impact internship program. That program connects individuals in the Autodesk Foundation portfolio with new engineers.

In 2022, ASME and Autodesk released the results of a collaborative multiphase research project on the future of manufacturing. The effort involved a research study conducted from August 2021 through May 2022. The resulting report identified the future workflows and skills required for mechanical engineering, manufacturing engineering and CNC machinist roles.

“This was the project that was the basis for the six-course curriculum. The second phase of the project was the curriculum design and creation. We piloted the first four courses by launching a competition relating to sustainability and ocean clean-up,” says Singh.

The Autodesk-hosted event featured university teams designing an autonomous robot to clean up trash from the ocean. Students relied on skills they had learned from the courses.

One of the teams in the competition, Wissen Marinos, was formed of students from India’s National Institute of Technology Silchar. Wissen Marinos team captain Pratisruti Buragohain says participating in the competition enabled team members to develop problem-solving abilities, technical skills and soft skills.

“Despite facing various hurdles along the way, we tackled each one of them strategically and with meticulous determination. In essence, our experience throughout the competition bestowed upon us invaluable lessons, equipping us with enhanced design proficiency, research skills and efficient problem-solving strategies,” says Buragohain.

Additional steps for the curriculum have included the translation and localization of the courses into Japanese and German. ASME and Autodesk are tracking how widely the curriculum is used and asking what information students and engineers are learning from it.

“Any curriculum takes time. It’s going to take time to drive the use of this curriculum. That’s about keeping a pulse on the industry, hearing what they have to say and what Autodesk’s customers have to say,” says Chan.

Pothier says Autodesk is striving to close the skills gap and be a trusted partner to engineering firms.

“We give the underpinning of, ‘This is how you do it with Fusion, and we’re giving you modular pieces.’ We’re giving it to universities and firms in a way that students really want to consume it. Our team is very passionate because we feel if you’re going to be sending your kids to school, they need those skills today,” says Pothier.

View the courses at: https://www.autodesk.com/learn/ondemand/collection/asme-manufacturing-education-courses.

The post Industry 4.0 gets a curriculum developed by ASME and Autodesk appeared first on Engineering.com.

AI and Industry 5.0 are definitely not hype https://www.engineering.com/ai-and-industry-5-0-are-definitely-not-hype/ Mon, 24 Feb 2025 20:52:58 +0000 https://www.engineering.com/?p=137042 The biggest players in manufacturing convened at the ARC Industry Leadership Forum, and they were all-in on AI.

There is a lingering sentiment among the manufacturing community that the trends towards AI, digitalization and digital transformation (collectively referred to as Industry 5.0) are nothing more than marketing hype designed to sell new products and software.

Nothing could be further from the truth.

Granted, any new trend will always have an element of bandwagon business from marginal players and hype-riders looking to cash in on it.

But in terms of how digital transformation and AI are being researched and implemented in the manufacturing industry, there is plenty of steak to go along with all that sizzle.

One of the best ways to distinguish between an over-hyped trend and something with substance is to watch who is watching it. A great place to see that in action was at the recent ARC Industry Leadership Forum, which took place in Orlando, Fla. February 10-13.

Nico Duursema, CEO of Cerilon, delivers his keynote address at the ARC Industry Leadership Forum in Orlando, Fla. (Image: ARC Advisory Group, taken from X, formerly Twitter)

This year’s event was almost entirely focused on AI, digital transformation and Industry 5.0 in manufacturing. It attracted more than 600 attendees representing some of the biggest companies in the manufacturing sector.

Indeed, the top 30 of these attending companies with publicly available financial numbers had a combined 2023 market cap of $4.22 trillion. If that figure were a national GDP, it would rank as the fourth-largest economy in the world, just behind Germany ($4.5 trillion) and ahead of Japan ($4.20 trillion). Most of these companies were users undergoing significant digital transformation initiatives.

The fact that these industrial heavyweights are already fully invested in implementing AI and digital strategies shows the scale of the opportunity, and the huge strategic risk of ignoring it—we’re talking Blockbuster Video-level strategic risk.

But the question remains: where do you begin, especially if you don’t have the capital and assets of these massive multinational businesses?

Everywhere, all at once

In the current state of things, engineering leaders can be easily overwhelmed with all the trends and challenges thrown at them. Mathias Oppelt, vice-president of customer-driven innovation at Siemens Digital Industries (Siemens is certainly a technology vendor, but also manufactures its products using the latest smart manufacturing principles), hears about this from his customers daily and summed it up nicely during his session at the ARC Forum:

“You need to act more sustainably; you need to have higher transparency across your value chain. Have you thought about your workforce transformation yet? There’s a lot of people retiring in the next couple of years and there’s not many people coming back into the workforce. You still must deal with cost efficiency and all the productivity measures, while also driving energy efficiency. And don’t forget about your competition—they will still be there. And then there’s all that new technology coming up, artificial intelligence, large language models, ChatGPT—and on it goes, all of that all at once.”

Sound familiar?

Even with all these challenges, everything must now be done at speed. “Speed and adaptability will be the key drivers to continuing success. You need to adapt to all these challenges, which are continuously coming at you faster. If you’re standing still, you’re almost moving backwards,” Oppelt said.

The answer is simple, offered Oppelt with a wry smile: just digitally transform. The crowd, sensing his sarcasm, responded with nervous laughter. It was funny, but everyone understood it was also scary, because no one really knows where to start.

Bite the bullet, but take small bites

“The continuous improvement engineers out there know how risky it can be to bite off more than the organization can chew or to try to drive more change than it can manage,” says Doug Warren, senior vice president of the Monitoring and Control business for Aveva, a major industrial software developer based in Cambridge, U.K.

“It helps to take bite-sized pieces, and maybe even use the first bite to drive some incremental benefit or revenue to fund the next bite and then the next bite. You can sort of see this self-funding approach emerge, assuming the business objectives and the metrics tied to those business objectives show results.”

Warren is puzzled by how slow a number of industrial segments have been to fully embrace digitalization and digital transformation, saying that “…it seems like everyone has at least dipped a toe or a foot into the water,” but the number of organizations that are doing it at scale across the whole enterprise is lower than most people would guess.

“The level of technological advancement doesn’t come as a big surprise, and where we go from here won’t be a big surprise. The trick will be how fast you get past the proof-of-concept and into full scale deployment,” he says.

From Warren’s perspective, if you’re not taking advantage of the digitalization process to fundamentally change the way you’re doing work, then you’re probably not getting as much value.

“To just digitize isn’t enough. How do we change those work processes? How do we inject more efficiency into work processes to take advantage of the technological advancements you are already investing in? That’s the special sauce,” he says, conceding that it’s difficult because people typically prefer routine and structure. “That’s probably got a lot to do with the lack of real speed of adoption, because you still have to overcome the way you’ve always done it.”

Warren says a good way to look at it is like a more nuanced version of the standard continuous improvement initiatives companies have been undertaking for decades.

“Continuous improvement is incremental changes over time, where digital transformation provides at least an impetus for more of a step change in the way we perform work, whatever that work might be.”

What’s old is new again

One of the main points of hesitation towards full-scale implementation of digital transformation or AI initiatives is the perceived newness of it all and the uncertainty and risk associated with so-called “bleeding edge” technology.

The thing is, none of this is all that new. The concept of the neural network was developed in the 1940s and Alan Turing introduced his influential Turing Test in 1950. The first AI programs were developed in the 1950s. If you are a chess enthusiast, you’ve certainly played against AI opponents for the last 20 years. Most popular video games have had storylines fueled by AI-powered non-player characters (NPCs) for almost as long.

What has changed over the last few decades is the amount of computing power available, the democratized access to that compute power through the cloud, and the speed provided by the latest advances in chips.

This growth in available computational power and technology can now be applied to all the improvements organizations have been pursuing through continuous improvement programs, and these tools are proving most effective when combined with the extensive knowledge found within companies.

“Industry definitely provides complexities because it’s not just AI and machine learning (ML). There’s also domain knowledge, so it’s really a hybrid approach,” says Claudia Chandra, chief product officer for Honeywell Connected Industrials based in San Francisco.

Chandra earned a Ph.D. in artificial intelligence and software engineering from UC Berkeley 25 years ago and has spent her career working with data, AI, edge platforms and analytics.

“I’m not for just AI/ML on its own. It’s really the domain knowledge that needs to be incorporated along with (AI’s) first principles. The accuracy would not be there without that combination, because data alone won’t get you there,” Chandra said.

“That tribal knowledge needs to be codified, because that gets you there faster and might complement what’s in the data. So, digitization is the precursor to AI/ML—you need to collect the data first in order to get to AI/ML,” she says, reiterating that it must be a step-by-step process to reduce risk.
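One way to picture the hybrid, step-by-step approach Chandra describes is a residual model: a first-principles equation supplies a baseline prediction, and a machine-learning model is trained only on the error the physics cannot explain. Below is a minimal sketch in Python, assuming a simplified pump-power equation and synthetic sensor data purely for illustration; none of the variable names or numbers come from Honeywell.

```python
# Minimal sketch of a hybrid "first principles + ML" residual model (illustrative only).
# The pump-power equation and the synthetic data are assumptions made for this example.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(seed=0)
n = 2000

# Synthetic "sensor" readings: flow (m^3/s), head (m), fluid density (kg/m^3)
flow = rng.uniform(0.05, 0.5, n)
head = rng.uniform(10, 80, n)
rho = rng.normal(1000, 5, n)

def physics_power_kw(flow, head, rho, efficiency=0.7):
    """Domain knowledge: hydraulic pump power P = rho * g * Q * H / efficiency, in kW."""
    g = 9.81
    return rho * g * flow * head / efficiency / 1000.0

# The "true" power includes effects the simple equation misses (wear, recirculation, noise)
true_power = physics_power_kw(flow, head, rho) * (1 + 0.1 * np.sin(10 * flow)) + rng.normal(0, 2, n)

X = np.column_stack([flow, head, rho])
baseline = physics_power_kw(flow, head, rho)

# The ML model learns only the residual that the physics model cannot explain
residual_model = GradientBoostingRegressor()
residual_model.fit(X, true_power - baseline)

def hybrid_predict(X):
    """Prediction = first-principles baseline + learned data-driven correction."""
    return physics_power_kw(X[:, 0], X[:, 1], X[:, 2]) + residual_model.predict(X)

print("Hybrid predictions for the first three samples:", hybrid_predict(X[:3]))
```

In this framing, the codified tribal knowledge is the baseline equation and the digitized historian data trains the correction, which is why the digitization step has to come first.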

Chandra says companies that have taken these incremental steps towards digitalization and embrace the cloud or even more advanced tech such as AI/ML will find that their digital transformation is no longer a behemoth with all the pain and risk that go with it. Plus, any vendor with a good understanding of the technology will provide at least a starting point—including pre-trained models—so companies don’t have to start from scratch. “But ultimately, as you train it more, as you use it more, it will get better with the data that’s specific to your company,” she says.
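Her point about vendor-supplied starting points getting better with company-specific data can likewise be sketched as incremental training: fit a model once on generic data, then keep updating it with batches from your own operation. The snippet below shows that pattern with scikit-learn’s partial_fit on synthetic data; it is a generic illustration, not a description of any particular vendor’s pre-trained models.

```python
# Illustrative sketch: a model that keeps improving as plant-specific data arrives.
# All data here is synthetic; the incremental-update pattern is the point.
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(seed=1)

def make_batch(n, bias):
    """Synthetic batch: two sensor features and a target with a site-specific bias."""
    X = rng.normal(size=(n, 2))
    y = 3.0 * X[:, 0] - 1.5 * X[:, 1] + bias + rng.normal(0, 0.1, n)
    return X, y

# "Vendor starting point": fit once on generic data (no site-specific bias)
model = SGDRegressor(random_state=0)
X0, y0 = make_batch(1000, bias=0.0)
model.fit(X0, y0)

# Incrementally adapt to plant-specific data (site bias = 2.0) as it streams in
for week in range(10):
    X_new, y_new = make_batch(200, bias=2.0)
    model.partial_fit(X_new, y_new)

X_test, y_test = make_batch(500, bias=2.0)
print(f"R^2 on plant-specific data after adaptation: {model.score(X_test, y_test):.3f}")
```

The same shape applies to fine-tuning larger pre-trained models: the vendor’s weights are the starting point, and batches of site-specific data gradually pull the model toward local operating conditions.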

Certainly, the success of any AI-enabled digital transformation initiative is all about the underlying data and training the AI appropriately to get the required accuracy. But it takes several steps to set the conditions for value generation: Commit to a project; start small with the right use case; and be persistent and diligent with the data. Once you get a small victory, put the value and the experience towards the next project. With such an approach, you will soon learn why AI and Industry 5.0 are here to stay—and so will your competition.

The post AI and Industry 5.0 are definitely not hype appeared first on Engineering.com.
