Digital Transformation - Engineering.com https://www.engineering.com/category/technology/digital-transformation/ Thu, 18 Sep 2025 13:55:25 +0000

This low-tech tool sharpens your digital transformation strategy https://www.engineering.com/this-low-tech-tool-sharpens-your-digital-transformation-strategy/ Thu, 18 Sep 2025 11:12:00 +0000 https://www.engineering.com/?p=143021 It's ironic that the Engineering Services DX Assessment Tool, a simple instrument developed at the University of Waterloo, is as low-tech as it gets.

The post This low-tech tool sharpens your digital transformation strategy appeared first on Engineering.com.

Charlie Patel’s family has been providing engineering services to manufacturing companies in Ontario for 75 years. Over that period, technological advances improved many aspects of their clients’ operations, arriving at a pace that Charlie’s company could adapt to without too much difficulty. Today is different.

Charlie is considering how his company should respond to today’s rapid technological change, including what to do about artificial intelligence. He hears about the revolutionary impact AI will have on what seems like a daily basis. He knows that new companies are emerging to provide technology-based manufacturing engineering services, and he wants to ensure they do not erode the work done by his own firm. Charlie needs to understand the impact these new technologies will have on his business and what his strategy should be for dealing with it.

Digital transformation is the response that organisations are making to the Fourth Industrial Revolution (the world created by the rapid technological advance taking place today). It can mean changes in products, services and processes throughout the organisation. It can be small or large in scale, radically changing business models, and it means new technologies are being introduced across organisations, with significant implications for engineering services organisations.

Engineering services are affected both by the new environment they must support and by the new tools available for doing so. A wide range of technologies may be adopted in areas that fall within the scope of work normally done by engineering services. These might include the development, implementation and support of new technology-enabled processes, new automation, and new decision-making systems that may or may not use artificial intelligence.

At the same time, technology is changing the support engineering services provide. More data is being collected, and better tools exist for predictive maintenance services. New design tools enable faster, better design; some aspects of service delivery can be automated; artificial intelligence can be used for analytics; digital twins and simulation support analytics, design and management; and big data can provide valuable insights.

These developments in the environment engineers support, and in the tools available to them, are the main factors influencing the digital transformation of engineering services. They make it essential for engineering services organisations to carefully review their current situation and develop their own strategy for dealing with it. Otherwise, they will be vulnerable to other providers who emerge better prepared for the new environment.

The Engineering Services DX Assessment Tool is a simple instrument that we have developed at the University of Waterloo to help engineering services organisations consider and plan their own digital transformation. It is intended to be facilitative – prepared on a whiteboard or flip chart by a group of engineers. Here is a blank version:

The tool is suitable for engineering services companies and engineering services within an existing organisation. It asks you to consider the following elements:

Client/Dept: The main organisations or units the engineering services are provided to. For an engineering services company, this may be its main clients (or client types, if it is larger). For internal engineering services, this would be the units they provide services to.

Change Elements: The changes in the Client/Dept that are or will be driven by new technology. These may include specific performance improvements, equipment changes, process changes, etc.

Main Tech: The main technologies being used in the Change Elements. These may include the Internet of Things, artificial intelligence, digital twins, automation, etc.

Implement Support: The support needed to implement the change described in the Change Elements column, such as design work, project management, impact assessment, etc.

Operate Support: The support needed to operate the change described in the Change Elements column, such as maintenance, education and training, and performance improvement.

Impact Services Now: Can your existing services provide the Implement and Operate Support that the Change Elements need now, or are changes required to do this? Include here any areas of your services that the client may have used in the past but that are no longer needed due to the Change Element.

Action Needed: Review the information you have entered in this row of the chart and determine the actions that you need to take to deliver the support that your Client/Dept will require.
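The chart itself belongs on the flip chart, but its row structure can be captured in a few lines of code by teams that want to record results digitally. This is only an illustrative sketch: the field names mirror the column headings described above, and the sample values are invented, not drawn from Charlie's actual chart.

```python
from dataclasses import dataclass

@dataclass
class AssessmentRow:
    """One row of the Engineering Services DX Assessment Tool.
    Field names mirror the column headings; sample values are invented."""
    client_dept: str          # who the services are provided to
    change_elements: str      # technology-driven changes at the client
    main_tech: list           # e.g. ["IoT", "AI", "digital twins"]
    implement_support: str    # support needed to implement the change
    operate_support: str      # support needed to operate the change
    impact_services_now: str  # can existing services cover this today?
    action_needed: str        # actions to deliver the required support

row = AssessmentRow(
    client_dept="Automotive parts plant",
    change_elements="Predictive maintenance on the press line",
    main_tech=["IoT", "AI"],
    implement_support="Sensor selection, project management",
    operate_support="Model monitoring, training",
    impact_services_now="No predictive-analytics capability in-house",
    action_needed="Build or hire data-analytics skills",
)
print(row.client_dept, "->", row.action_needed)
```

Each completed row then becomes one record that can be sorted by priority when the strategy is drawn up.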

In this example, Charlie has completed the tool for his company:

The Engineering Services DX Assessment Tool allows you to consider the actions you might wish to take to ensure your organisation is able to continue to effectively provide engineering support. Once the chart is completed you can then consider the areas that will be your priority and become the main elements in your digital transformation strategy.

This strategy must include the impact that the Client/Dept changes will have on the skills of your engineering team, along with any personnel changes you may need to make. The changes will also require engineers to engage and collaborate with stakeholders, drawing on social skills more often than in the past because of the faster pace of change.

It should also include consideration of the technology-based tools that your team uses today (for example data analytics, simulation etc.) and investments in any new tools that may be appropriate here.

Developing and implementing your digital transformation strategy for Engineering Services is essential today. As Client/Dept organisations plan and implement their own digital transformation strategies they will consider the role existing engineering services providers can play. Be prepared with your own digital transformation strategy.

Register for Digital Transformation Week 2025 https://www.engineering.com/register-for-digital-transformation-week-2025/ Tue, 09 Sep 2025 00:54:14 +0000 https://www.engineering.com/?p=142714 Engineering.com’s September webinar series will focus on how to make the best strategic decisions during your digital transformation journey.

The post Register for Digital Transformation Week 2025 appeared first on Engineering.com.

Digital transformation remains one of the hottest conversations in manufacturing in 2025. A few years ago, most companies approached digital transformation as a hardware issue. But those days are gone. Now the conversation is a strategic one, centered on data management and creating value from the data all the latest technology generates. The onrush of AI-based technologies only clouds the matter further.

This is why the editors at Engineering.com designed our Digital Transformation Week event—to help engineers unpack all the choices in front of them, and to help them do it at the speed and scale required to compete.

Join us for this series of lunch hour webinars to gain insights and ideas from people who have seen some best-in-class digital transformations take shape.

Registrations are open and spots are filling up fast. Here’s what we have planned for the week:

September 22: Building the Digital Thread Across the Product Lifecycle

12:00 PM Eastern Daylight Time

This webinar is the opening session for our inaugural Digital Transformation Week. We will address the real challenges of implementing digital transformation at any scale, focusing on when, why and how to leverage manufacturing data. We will discuss freeing data from its silos and using your bill of materials as a single source of truth. Finally, we will help you understand how data can fill in the gaps between design and manufacturing to create true end-to-end digital mastery.

September 23: Demystifying Digital Transformation: Scalable Strategies for Small & Mid-Sized Manufacturers

12:00 PM Eastern Daylight Time

Whether your organization is just beginning its digital journey or seeking to expand successful initiatives across multiple departments, understanding the unique challenges and opportunities faced by smaller enterprises is crucial. Tailored strategies, realistic resource planning, and clear objectives empower SMBs to move beyond theory and pilot phases, transforming digital ambitions into scalable reality. By examining proven frameworks and real-world case studies, this session will demystify the process and equip you with actionable insights designed for organizations of every size and level of digital maturity.

September 24, 2025: Scaling AI in Engineering: A Practical Blueprint for Companies of Every Size

12:00 PM Eastern Daylight Time

You can’t talk about digital transformation without covering artificial intelligence. Across industries, engineering leaders are experimenting with AI pilots — but many remain uncertain about how to move from experiments to production-scale adoption. The challenge is not primarily about what algorithms or tools to select but about creating the right blueprint: where to start, how to integrate with existing workflows, and how to scale in a way that engineers trust and the business can see immediate value. We will explore how companies are combining foundation models, predictive physics AI, agentic workflow automation, and open infrastructure into a stepped roadmap that works whether you are a small team seeking efficiency gains or a global enterprise aiming to digitally transform at scale.

September 25: How to Manage Expectations for Digital Transformation

12:00 PM Eastern Daylight Time

The digital transformation trend is going strong and manufacturers of all sizes are exploring what could be potentially game-changing investments for their companies. With so much promise and so much hype, it’s hard to know what is truly possible. Special guest Brian Zakrajsek, Smart Manufacturing Leader at Deloitte Consulting LLP, will discuss what digital transformation really is and what it looks like on the ground floor of a manufacturer trying to find its way. He will chat about some common unrealistic expectations, what the realistic expectation might be for each, and how to get there.

How small language models can advance digital transformation – part 1 https://www.engineering.com/how-small-language-models-can-advance-digital-transformation-part-1/ Thu, 04 Sep 2025 17:33:09 +0000 https://www.engineering.com/?p=142603 Comparing the characteristics of SLMs to LLMs for digital transformation projects.

The post How small language models can advance digital transformation – part 1 appeared first on Engineering.com.

Small language models (SLMs) can perform better than large language models (LLMs). This idea is counterintuitive because we often assume that more information technology capacity is better for search, data analytics and digital transformation. In practice, many engineering AI applications simply don't require an LLM.

SLMs offer numerous advantages for small, specialized AI applications, such as digital transformation. LLMs are more effective for large, general-purpose AI applications.

Let’s compare the characteristics of SLMs to LLMs for digital transformation projects.

SLM vs. LLM focus

SLMs are efficient, domain-specific AI models optimized for tasks that can run on smaller devices using limited resources. LLMs are powerful, general-purpose AI models that excel at complex tasks but require substantial computing resources.

SLMs are explicitly designed for small domain-specific tasks, such as digital transformation, which is critical to the work of engineers. SLMs offer high accuracy for niche AI applications. LLMs, on the other hand, are trained on enormous datasets to enable them to respond to a wide range of general-purpose tasks. LLMs sacrifice accuracy and efficiency to achieve general applicability.
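As a rough illustration, this trade-off can be reduced to a rule of thumb: niche, sensitive, or resource-constrained work points to an SLM, while broad general-purpose use points to an LLM. The factors and the rule below are our simplification, not a formal selection method.

```python
def suggest_model_type(domain_specific: bool, privacy_critical: bool,
                       resources_limited: bool) -> str:
    """Toy rule of thumb for the SLM-vs-LLM choice.
    The three factors are a simplification for illustration."""
    if domain_specific or privacy_critical or resources_limited:
        return "SLM"
    return "LLM"

print(suggest_model_type(True, True, True))     # typical DX project -> SLM
print(suggest_model_type(False, False, False))  # general-purpose use -> LLM
```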

Comparing language model characteristics

SLMs are quite different from LLMs, despite their similar names. Engineers can use these language model characteristics to determine which type best fits their digital transformation project.

See footnotes at the end of the story for a glossary of the above terms.

Considering data privacy support

Data privacy is a significant issue for digital transformation projects because the internal data being transformed often contains intellectual property that underlies the company’s competitive advantage.

Support for data privacy depends on where the SLM or LLM is deployed. If the AI model is deployed on-premises, data privacy can be high, provided appropriate cybersecurity defences are in place. If the SLM or LLM is deployed in a cloud data centre, data privacy varies depending on the terms of the cloud service agreement. Some AI service vendors state that all end-user prompts will be used to further train their AI model; other vendors commit to not using the provided data. If engineers are unsure that the vendor can meet its stated data privacy practices, or if those practices are unacceptable, then implementing the AI application on-premises is the only course of action.
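That deployment reasoning can be sketched as a short decision helper. This is a simplification for illustration only; real service agreements need legal and security review.

```python
def choose_deployment(vendor_trains_on_prompts: bool,
                      practices_verifiable: bool) -> str:
    """If the vendor trains on your prompts, or its stated privacy
    practices cannot be verified, deploy on-premises."""
    if vendor_trains_on_prompts or not practices_verifiable:
        return "on-premises"
    return "cloud (per the service agreement)"

# An unverifiable vendor forces the on-premises route:
print(choose_deployment(vendor_trains_on_prompts=False,
                        practices_verifiable=False))
```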

SLMs offer many advantages for digital transformation projects because these projects use domain-specific data. LLMs are more effective for large, general-purpose AI applications that require vast data volumes. In the follow-on article, we’ll discuss the differences between SLMs and LLMs for construction and operation.

Footnotes: AI Glossary

  1. Domain knowledge is knowledge of a specific discipline, such as engineering or digital transformation, in contrast to general knowledge.
  2. Parameters are the variables that the AI model learns during its training process.
  3. Contextual relevance is the ability of the AI model to understand the broader context of the prompt text to which it is responding.
  4. Curated proprietary domain-specific data is typically internal data to the organization. Internal data is often uneven or poor in quality. It is often a constraint on the value that AI applications based on an SLM can achieve. Improving this data quality will improve the value of AI applications.
  5. Accurate output is essential to building confidence and trust in the AI model.
  6. The accuracy of LLM output is undermined by the contradictions, ambiguity, incompleteness and deliberately false statements found in public web data. LLM output is stronger for English-language and Western contexts because that is where most of the web data originates.
  7. Bias refers to incidents of biased AI model output caused by human biases that skew the training data. The bias leads to distorted outputs and potentially harmful outcomes.
  8. Hallucinations are false or misleading AI model outputs that are presented as factual. They can mislead or embarrass. They occur when an AI model has been trained with insufficient or erroneous data.
  9. Prompts are the text that end-users provide to AI models to interpret and generate the requested output.

How digital transformation systems track the lifecycle of materials and equipment https://www.engineering.com/how-digital-transformation-systems-track-the-lifecycle-of-materials-and-equipment/ Mon, 18 Aug 2025 20:34:43 +0000 https://www.engineering.com/?p=142184 Digital transformation systems have become indispensable tools for tracking the lifecycle of materials and equipment in manufacturing.

The post How digital transformation systems track the lifecycle of materials and equipment appeared first on Engineering.com.

In the manufacturing industry, tracking the lifecycle of materials and equipment is critical for ensuring product quality, operational efficiency, compliance, and cost management.

Digital transformation—the integration of digital technology into all areas of business—has revolutionized how manufacturing companies manage this task. By leveraging technologies such as IoT, ERP, PLM, RFID, blockchain, digital twins, and AI-driven analytics, manufacturers can gain comprehensive visibility into the lifecycle of every material and asset in their operations.

Lifecycle tracking definitions and objectives

For this discussion, the term material lifecycle includes all stages from procurement through receiving, inventory management, production usage, waste or recycling, and compliance documentation, while the equipment lifecycle involves procurement, installation, usage, maintenance, inspection, upgrades, and decommissioning.

The desired outcomes from tracking materials and equipment haven't changed; only the way we track them has. Reducing downtime and waste, improving traceability and compliance, optimizing resource use, and enhancing forecasting and decision-making are still the important goals. With the right mix of digital technologies, making the correct decisions to reach these goals becomes a little easier.

Core technologies driving digital lifecycle tracking

Enterprise Resource Planning (ERP) Systems: ERP systems centralize and standardize data related to procurement, inventory, production, maintenance, and finance. They act as the backbone for lifecycle data management. An ERP suite will handle all sorts of tasks, including bill of materials (BOM) management; work order tracking; asset management; and integration with procurement and supply chain functions.

Product Lifecycle Management (PLM) Systems: PLM systems centralize and standardize data related to product design, development, engineering changes, and compliance. They act as the backbone for managing product information across its lifecycle. A PLM suite will handle all sorts of tasks, including CAD data management; version and change control; bill of materials (BOM) structuring; and integration with engineering, manufacturing, and quality processes.

Internet of Things (IoT): IoT sensors embedded in equipment or in the factory environment provide real-time telemetry data, such as temperature, vibration, pressure, and operating time. These sensors monitor equipment health and usage; ensure proper storage conditions for sensitive materials; and help automate maintenance schedules. Edge computing (processing data near the machines) enables the device or sensor to pre-process this data, reducing latency and bandwidth costs.
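A minimal sketch of that edge pre-processing idea: rather than streaming every raw sample to the cloud, the device sends one compact summary per time window. The sensor values and the alarm limit here are hypothetical.

```python
from statistics import mean

def summarize_window(readings, limit):
    """Edge-style pre-processing: send one summary per window
    (mean, peak, limit breach) instead of every raw sample."""
    peak = max(readings)
    return {"mean": round(mean(readings), 2),
            "peak": peak,
            "alarm": peak > limit}

window = [0.21, 0.22, 0.25, 0.74, 0.23]  # vibration readings, mm/s (invented)
print(summarize_window(window, limit=0.5))
```

Sending three numbers instead of hundreds of samples is what cuts the latency and bandwidth costs mentioned above.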

RFID and Barcode Tracking: RFID tags and 1D/2D barcodes allow automated identification and tracking of materials and equipment across facilities. This technology supports real-time inventory updates; automated check-in/check-out systems; and audit trails for material handling. RFID is particularly beneficial for high-value or mobile assets, reducing human error and labor costs.

Digital twins: A digital twin is a virtual representation of a physical asset or process. It uses real-time data to simulate, monitor, and analyze the condition and behavior of the asset. This technology is currently being used for predictive maintenance; root cause analysis; and equipment lifecycle visualization. Digital twins integrate with IoT platforms, ERP, PLM and CAD systems, creating a multi-source feedback loop for continual improvement.

AI and Analytics Platforms: Machine learning models analyze lifecycle data to predict equipment failure, optimize material usage, and improve production planning. These models aren't new, per se, but they are now being applied to all sorts of situations in manufacturing companies, such as anomaly detection in sensor data; forecasting inventory needs; and identifying underperforming assets, equipment, or suppliers. AI-powered analytics platforms often integrate with ERP or MES (Manufacturing Execution Systems) to generate actionable insights.

Lifecycle tracking workflows

Material lifecycle tracking begins at procurement, where ERP systems automatically generate purchase orders based on demand forecasts. Upon delivery, RFID tags or barcodes on materials are scanned and matched against purchase orders. Relevant data—such as supplier, batch number, and date—is logged into the system for traceability. In the storage and inventory phase, IoT sensors monitor warehouse conditions, triggering automated alerts if environmental parameters like temperature or humidity deviate from set thresholds. Materials are organized based on criteria such as shelf life, usage priority or regulatory guidelines.

During production, materials are scanned into batches, creating a digital link between raw materials and finished goods for full traceability. Waste generated is tracked and categorized (e.g., recyclable, hazardous) to support sustainability goals. After production, unused materials are either returned to inventory or flagged for disposal. All associated data is stored in the ERP system and, optionally, on blockchain networks for enhanced auditability and compliance.
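The batch-level traceability described above amounts to recording which raw-material lots were scanned into each production batch. A toy sketch, with invented identifiers:

```python
# Batch genealogy: scanning materials into a batch creates the
# digital link used later for recalls and audits.
genealogy = {}

def scan_into_batch(batch_id, material_lot):
    genealogy.setdefault(batch_id, []).append(material_lot)

def trace(batch_id):
    """Which raw-material lots went into this finished-goods batch?"""
    return genealogy.get(batch_id, [])

scan_into_batch("FG-1001", "RM-STEEL-77")
scan_into_batch("FG-1001", "RM-PAINT-12")
print(trace("FG-1001"))
```

A real system would keep this mapping in the ERP database (or on a blockchain, as noted above) rather than in memory.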

Equipment lifecycle tracking follows a similar digital framework. Upon procurement, equipment records are entered into the ERP or an asset management system, and a digital twin is initialized using the equipment’s baseline configuration. During use, IoT sensors continuously collect operational data, which is analyzed using machine learning to detect early signs of wear, anomalies, or potential failures. This enables predictive maintenance strategies, with the ERP or CMMS (Computerized Maintenance Management System) automatically generating and assigning work orders. Maintenance history is logged and linked to each asset’s digital twin for a comprehensive performance record.
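As one simple stand-in for the machine-learning checks described above, a deviation test against recent history can flag a reading that warrants a maintenance work order. The asset data and threshold here are hypothetical.

```python
from statistics import mean, stdev

def needs_work_order(history, latest, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations
    away from the recent history of readings."""
    mu, sigma = mean(history), stdev(history)
    return abs(latest - mu) > threshold * sigma

history = [70.1, 69.8, 70.3, 70.0, 69.9]  # bearing temperature, deg C
if needs_work_order(history, 82.4):
    # In a real deployment this would raise a CMMS work order.
    print({"asset": "PUMP-07", "task": "inspect bearing"})
```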

At the end of an asset’s useful life, the system flags it for decommissioning when performance drops beyond acceptable levels. Relevant disposal or recycling data is recorded for regulatory compliance, and the asset is removed from active digital systems.

Integration and interoperability across these systems are crucial. Manufacturers often use middleware or integration platforms—such as MuleSoft or Apache Kafka—to link ERP systems with MES, IoT platforms, and other operational tools. Interfacing RFID/barcode systems with inventory software and connecting digital twins to PLM tools ensure a unified data ecosystem. APIs, data lakes, and standardized data formats like OPC UA, JSON, and XML facilitate seamless, consistent data exchange.

Security and data governance are foundational to digital lifecycle tracking. Because these systems manage sensitive operational and supply chain data, robust cybersecurity practices are essential. This includes role-based access control (RBAC), encryption of data at rest and in transit, regular vulnerability assessments, and compliance with international standards such as ISO 27001, NIST, and GDPR. Blockchain technology can further enhance data integrity by creating tamper-resistant audit trails, while cloud platforms (e.g., Azure, AWS, Google Cloud) offer scalable, secure infrastructure for data storage and processing.
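The role-based access control mentioned above can be reduced, at its simplest, to a mapping from roles to permitted actions. A deliberately minimal sketch; the roles and permissions are invented:

```python
# Minimal role-based access control (RBAC): each role maps to the
# set of actions it may perform on lifecycle data.
PERMISSIONS = {
    "operator": {"read_telemetry"},
    "maintenance": {"read_telemetry", "close_work_order"},
    "admin": {"read_telemetry", "close_work_order", "edit_asset"},
}

def allowed(role, action):
    return action in PERMISSIONS.get(role, set())

print(allowed("operator", "edit_asset"))  # False
print(allowed("admin", "edit_asset"))     # True
```

Production systems layer this onto directory services and audit logging rather than a hard-coded dictionary.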

The benefits of digital lifecycle tracking span operational, financial, and regulatory domains. Operationally, it reduces downtime through predictive maintenance, improves inventory accuracy, and increases throughput via automation. Financially, it lowers operational costs, reduces waste and overstock, and enhances asset utilization and return on investment. From a regulatory standpoint, it simplifies audits and compliance reporting through end-to-end traceability and standardized documentation practices.

Digital transformation systems have become indispensable tools for tracking the lifecycle of materials and equipment in manufacturing. By integrating ERP, IoT, AI, RFID, and other technologies, manufacturers can gain real-time visibility, improve operational efficiency, and ensure regulatory compliance. As these technologies mature, the next evolution lies in greater automation, AI decision-making, and more resilient supply chains, all driven by data-rich, digitally connected environments.

How are engineers using spatial computing? https://www.engineering.com/how-are-engineers-using-spatial-computing/ Thu, 14 Aug 2025 14:30:00 +0000 https://www.engineering.com/?p=141799 AR, VR, and MR are increasingly valuable as tools for visualization, collaboration, presentations and more.

The post How are engineers using spatial computing? appeared first on Engineering.com.

While you probably already have a device that can access augmented reality (AR), you can’t visit virtual reality without a VR headset. But there are plenty of options to choose from, ranging from consumer-targeted products for a few hundred dollars to enterprise VR headsets that provide better resolution and responsiveness but cost thousands of dollars. Some VR headsets are self-contained computers with internal processors, but others depend on a connection to a GPU-equipped engineering workstation.

Of course, the hardware alone doesn’t get you very far. Software with support for spatial computing is growing by the day. Some CAD programs support VR directly, allowing designers to easily switch to a virtual view of their models. Other software caters to VR design reviews with features for collaboration and markup. Game engine software, sometimes called real-time 3D software, can be used to develop custom AR or VR experiences using existing CAD models.

Though there’s an upfront cost to getting started with spatial computing—both in the price of headsets and software as well as the learning curve for users—for many engineers, the cost is well worth it. VR provides an unparalleled way to visualize and refine a design, and has thus become a part of many engineering workflows.

In this article, we’ll look more closely at the main ways engineers, architects, manufacturers and others are using spatial computing.

Visualization and collaboration

By strapping on a mixed reality (MR) headset or peering through an AR-capable phone or tablet, engineers can see their work as if it were in the real world. This isn’t a gimmick—engineers, after all, design products for the real world, and so visualizing it in place rather than on a computer monitor is an obvious advantage.

Besides the depth and perspective that spatial computing provides to visualization, another important benefit is scale. An engineer designing a small bracket could see that product in life size using a screen, but anything bigger than a computer monitor would be an exercise in imagination (even the most multi of monitor setups can’t fit a car, airplane or space station). With spatial computing, all designers have the opportunity to visualize their designs at scale.

This spatial computing benefit is most readily apparent to designers of, unsurprisingly, spaces. Architects, for instance, can use VR to virtually walk through their building designs, achieving a sense of the space that’s simply impossible through traditional computing. This walkthrough need not be limited to a static showcase, either. Inside VR, designers have the opportunity to make changes on a whim. Don’t like that material finish, that lighting, that façade? A few clicks of your VR controller and you can see it in any number of options. Want to change the time of day, the season of the year, the weather? Go for it. You’re in a virtual world; you control every aspect of it.

Regardless of what you’re designing, spatial computing gives you the ability to visualize it more realistically than ever. But you don’t have to be alone in your virtual world. Another big advantage of spatial computing is that it provides a three-dimensional meeting space. Putting the two together, VR gives engineering teams a way to conduct virtual design reviews with participants from around the globe. In these virtual meeting rooms, participants—who are often represented by virtual avatars—can walk around, talk about, and review 3D models as if they were evaluating a real prototype. Not only does this save the costs of manufacturing and travel, it allows engineering teams to iterate faster and develop better end products.

Factory planning, maintenance and optimization

There are many ways that engineers can use spatial computing for manufacturing. In the same way that an architect can walk through a virtual building, a factory planner can use VR to see a virtual layout of their facility. This realistic and immersive visualization allows them to not just see but experience problems, such as machinery collisions or insufficient spacing, that might otherwise go undetected.

Spatial computing also provides the opportunity to simulate how factory workers interact with the environment, a crucial step for optimizing ergonomics. Even if a process seems fine on a computer monitor, stepping into a VR version of it would make it readily apparent that workers would have to, say, bend down too much to grab the next component. It’s a fix that’s all the simpler for catching it in advance.

AR and VR can both improve the process of equipment maintenance as well. This might take the form of an augmented video call between a worker and an off-site maintenance technician, who could annotate a piece of equipment from afar while the worker sees the notes, in place on the equipment, through an AR-enabled tablet. The technician might have learned about the equipment from a VR manual, virtually taking it apart and putting it back together.

Another popular use case for VR is for operator training, as it provides an unparalleled platform to simulate different scenarios. This could be used to train workers on their core job and beyond. For example, a VR simulation of a factory fire or hazardous spill could get every employee viscerally comfortable with emergency procedures.

Presentations and marketing

The immersive experience of spatial computing is a natural fit for showing off your product, whether internally or externally. For the same reasons that engineers and architects enjoy VR for design visualization and collaboration, the technology is a great option for presenting product concepts to others within an organization. Sketches and renders are nice, but they don’t beat life-like representation in a real environment.

Similarly, consumers increasingly appreciate—in some categories, even expect—spatial computing models that they can try on at home, so to speak. This is particularly common for products with lots of aesthetic variation, such as furniture. It may be difficult to pick out the perfect sectional in a brightly-lit showroom, but if you were able to compare options in your own home, the choice would be much easier. All you need is an AR-capable phone—and for the manufacturer to give you an AR option on their website. Combined with configuration tools, spatial computing gives potential customers the most convincing and personalized sales pitch you could imagine.

The post How are engineers using spatial computing? appeared first on Engineering.com.

How digital transformation and remote monitoring drive sustainability in manufacturing
https://www.engineering.com/how-digital-transformation-and-remote-monitoring-drive-sustainability-in-manufacturing/ (Wed, 13 Aug 2025 17:55:34 +0000)

Sustainability is no longer a peripheral concern; it’s a strategic and financial imperative.

Regulatory pressure, stakeholder expectations, and rising energy costs have made environmental stewardship critical to long-term success. For manufacturing engineers, this presents both a challenge and an opportunity: How can operations become more resource-efficient without compromising productivity?

The answer increasingly lies in digital transformation systems — specifically, in the deployment of remote monitoring technologies that turn real-time data into actionable sustainability improvements. From energy and water efficiency to predictive maintenance and emissions tracking, these technologies are reshaping how manufacturers optimize resource use and reduce their environmental footprint.

Indeed, the path to sustainable manufacturing runs through data — and remote monitoring is the bridge.

What is remote monitoring?

Remote monitoring involves using Internet of Things (IoT) sensors, embedded systems, and cloud platforms to continuously collect and analyze data from equipment, utilities, and environmental systems across a facility. This data is centralized through manufacturing execution systems (MES), enterprise resource planning (ERP) software, or dedicated building management systems (BMS).

Instead of relying on manual checks, logbooks, or periodic audits, engineers and facility managers get real-time visibility into performance metrics — allowing them to make faster, more informed decisions that directly impact sustainability.

Energy efficiency through real-time monitoring

Energy use is one of the biggest drivers of cost and carbon emissions in manufacturing. Remote monitoring enables a granular view of energy consumption across assets and zones, revealing exactly where, when, and how energy is being used or wasted.

Smart meters and sub-meters connected to a centralized dashboard can identify all sorts of conditions on the shop floor and beyond, including:

  • Idle equipment that’s consuming power during off-hours
  • HVAC systems operating outside of optimal temperature ranges
  • Lighting systems left on in unoccupied zones
  • Peak load times where demand charges can be minimized

By linking this data with control systems, manufacturers can automate load balancing, schedule equipment operations, and even initiate demand-response actions in coordination with utility providers. This reduces both energy costs and greenhouse gas emissions.
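The shop-floor conditions above lend themselves to simple rule-based checks. Below is a minimal, hypothetical sketch of the first one, flagging equipment that draws power during off-hours; the schedule, threshold, asset names, and readings are all illustrative assumptions, not taken from any real system.

```python
# Hypothetical sketch: flag equipment drawing power outside scheduled hours,
# assuming sub-meter readings arrive as (hour, kW) samples per asset.
OFF_HOURS = set(range(0, 6)) | set(range(22, 24))  # assumed plant schedule
IDLE_THRESHOLD_KW = 0.5  # below this, the asset is treated as properly off

def find_idle_waste(readings):
    """Return assets consuming power during off-hours.

    readings: dict mapping asset name -> list of (hour, kW) samples.
    """
    flagged = {}
    for asset, samples in readings.items():
        waste = [(h, kw) for h, kw in samples
                 if h in OFF_HOURS and kw > IDLE_THRESHOLD_KW]
        if waste:
            flagged[asset] = sum(kw for _, kw in waste)  # total off-hour kW observed
    return flagged

readings = {
    "press_01": [(23, 4.2), (2, 3.9), (10, 12.0)],  # left running overnight
    "cnc_07":   [(23, 0.1), (3, 0.0), (11, 8.5)],   # properly shut down
}
print(find_idle_waste(readings))  # press_01 is flagged; cnc_07 is clean
```

In practice the thresholds would come from equipment schedules in the MES or BMS rather than constants, but the rule itself stays this simple.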

Water conservation and waste reduction

Water plays a crucial role in many manufacturing processes — from cooling and cleaning to production itself. However, leaks, inefficiencies, and overuse are common and costly. Remote monitoring helps tackle this by using flow sensors, pressure gauges, and smart valves to track water use in real time. Cooling systems can be optimized to reduce unnecessary water cycling and smart alerts can be triggered by unexpected consumption spikes, pointing to leaks or process failures. Usage trends can be analyzed to adjust cleaning cycles or reuse treated wastewater.
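A smart alert of the kind described can be as simple as comparing each reading to a trailing baseline. This hypothetical sketch flags flow readings that exceed a rolling average; the window size, factor, and data are illustrative.

```python
# Hypothetical sketch: flag a possible leak when flow exceeds a rolling
# baseline, assuming evenly spaced flow-meter readings in litres per minute.
def detect_spikes(flows, window=5, factor=1.5):
    """Return indices where flow exceeds `factor` x the trailing-window mean."""
    alerts = []
    for i in range(window, len(flows)):
        baseline = sum(flows[i - window:i]) / window
        if flows[i] > factor * baseline:
            alerts.append(i)
    return alerts

flows = [20, 21, 19, 20, 22, 21, 20, 45, 48, 21]  # sustained jump suggests a leak
print(detect_spikes(flows))  # → [7, 8]
```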

In plants with on-site wastewater treatment, remote monitoring can ensure compliance with discharge limits and optimize treatment operations, minimizing environmental impact while reducing chemical and energy usage.

Predictive maintenance and asset efficiency

We’ve covered this a lot in this series—for a reason. One of the most effective ways to reduce waste and energy consumption is to keep machinery operating at peak efficiency. With remote condition monitoring, engineers can track vibration, temperature, current draw, and operational hours of key equipment in real time.
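As a rough illustration of that condition monitoring, the sketch below compares channel readings against assumed limits; the channel names and limit values are invented for the example, not real nameplate figures.

```python
# Hypothetical sketch: compare condition-monitoring channels against
# assumed alarm limits; all names and values are illustrative.
LIMITS = {"vibration_mm_s": 7.1, "temperature_c": 85.0, "current_a": 40.0}

def health_check(readings):
    """Return channels that exceed their limit, as (channel, reading, limit)."""
    return [(ch, val, LIMITS[ch])
            for ch, val in readings.items()
            if ch in LIMITS and val > LIMITS[ch]]

pump = {"vibration_mm_s": 9.3, "temperature_c": 78.2, "current_a": 41.5}
for ch, val, lim in health_check(pump):
    print(f"{ch}: {val} exceeds limit {lim}")  # vibration and current draw flagged
```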

Environmental monitoring and emissions tracking

Modern manufacturing operations are under pressure to reduce air emissions, particulate output, and volatile organic compounds (VOCs). Remote monitoring plays a vital role in tracking these metrics through ambient sensors, gas analyzers, and stack monitors connected to cloud systems.

These systems provide continuous emissions reporting for regulatory compliance and early warnings when emissions approach critical thresholds. They also maintain historical data used for environmental, social, and governance reporting.

This not only keeps operations within legal bounds but also supports a proactive approach to pollution prevention, enabling facilities to fine-tune combustion systems or ventilation processes based on real-time feedback.
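The early-warning behaviour described above amounts to a threshold check against a permit limit. The sketch below is a simplification under that assumption; the readings, limit, and 80% warning level are illustrative.

```python
# Hypothetical sketch: warn when a monitored emission approaches its permit
# limit, assuming stack-monitor readings and limits in the same units.
def emission_status(reading, limit, warn_at=0.8):
    """Classify a reading relative to its limit."""
    ratio = reading / limit
    if ratio >= 1.0:
        return "EXCEEDANCE"
    if ratio >= warn_at:
        return "WARNING"
    return "OK"

print(emission_status(42.0, 50.0))  # → WARNING (84% of limit)
print(emission_status(12.0, 50.0))  # → OK
```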

As more facilities adopt on-site renewable energy—be it solar, wind, or combined heat and power (CHP)—managing the variability and integration of these sources becomes essential. Remote monitoring allows for dynamic balancing of solar generation output against real-time load, battery storage availability, and grid draw during peak versus off-peak hours.

This maximizes the use of clean energy, reduces fossil fuel dependency, and lowers emissions associated with energy use. In some cases, surplus energy can be fed back into the grid or redirected to storage systems, enhancing sustainability while reducing operating costs.
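One way to picture that balancing is a greedy dispatch rule: serve the load from solar first, then battery, then grid, and export any surplus. The sketch below is a simplification under that assumption, with all figures illustrative.

```python
# Hypothetical sketch: greedy dispatch for one interval, all quantities in kW.
def dispatch(load, solar, battery_available):
    """Split load across solar, battery, and grid; export any solar surplus."""
    from_solar = min(load, solar)
    remaining = load - from_solar
    from_battery = min(remaining, battery_available)
    from_grid = remaining - from_battery
    export = solar - from_solar  # surplus fed back to the grid or to storage
    return {"solar": from_solar, "battery": from_battery,
            "grid": from_grid, "export": export}

print(dispatch(load=100, solar=60, battery_available=30))
# → {'solar': 60, 'battery': 30, 'grid': 10, 'export': 0}
print(dispatch(load=40, solar=70, battery_available=30))
# → {'solar': 40, 'battery': 0, 'grid': 0, 'export': 30}
```

A real system would add battery state-of-charge limits and tariff-aware timing, but the priority ordering is the core of the idea.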

Digital twins and process optimization

Beyond monitoring individual systems, the technology and processes involved in digitization and digitalization (which combine to form the basis of digital transformation) enable the creation and application of digital twins of production lines or facilities. By integrating real-time monitoring data into these simulations, engineers can model energy and resource usage under different production scenarios and test process changes before implementing them physically. This can identify optimal production settings that also reduce scrap, cycle time, or energy used per unit produced.
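As a toy example of the what-if comparison a monitoring-fed digital twin enables, the sketch below scores two production scenarios on energy per good unit; the scenario names and figures are invented for illustration.

```python
# Hypothetical sketch: compare production scenarios on energy per good
# (non-scrap) unit. All figures are illustrative, not from any real line.
def energy_per_good_unit(units, scrap_rate, kwh_total):
    """kWh consumed per sellable unit, after discounting scrap."""
    good = units * (1 - scrap_rate)
    return kwh_total / good

scenarios = {
    "current":     dict(units=1000, scrap_rate=0.05, kwh_total=4200),
    "slower_feed": dict(units=950,  scrap_rate=0.02, kwh_total=4000),
}
for name, s in scenarios.items():
    print(f"{name}: {energy_per_good_unit(**s):.2f} kWh/unit")
# the slower-feed scenario wins despite lower throughput
```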

This capability is powerful for continuous improvement and sustainability planning, allowing facilities to adapt quickly to new customers or product mixes.

The engineering advantage

For manufacturing engineers, the integration of remote monitoring technologies into digital transformation strategies isn’t just a sustainability move — it’s a smarter way to run a business. These systems deliver granular, real-time insights that enable better decisions, faster response, and long-term efficiency.

As sustainability becomes more tightly linked to profitability, risk management, and brand reputation, engineers who understand and embrace these technologies will be best positioned to lead their organizations into a more resource-efficient and environmentally responsible future.

How software is redefining sustainable building engineering
https://www.engineering.com/how-software-is-redefining-sustainable-building-engineering/ (Wed, 13 Aug 2025 16:44:51 +0000)

Digital platforms and emerging AI tools are bringing buildings to life.

In Milan’s Porta Nuova district, a vast, once-derelict rail yard has been transformed into one of Europe’s most advanced urban regeneration projects. Powered by geothermal pumps and covered in photovoltaic panels, buildings like Gioia 22 and Pirelli 35 are more than just energy efficient: they are software-defined environments where digital systems, sensors, and AI models continuously monitor and manage performance.

“The building must be alive,” Claudia Guenzi, head of smart infrastructure for Siemens in Italy, said at a recent press conference in Milan. “That means understanding and controlling its behaviour in real time. And we can’t just keep digging up grids. Software is the only scalable answer.”

It’s an important point, especially in areas with old infrastructure and legacy technologies to consider. Software may promise smarter, more efficient systems and buildings, but it remains a challenge to get it right. Without clear incentives, inclusive design, and a culture that understands its role and purpose, its value risks going unrealized.

Claudia Guenzi, head of smart infrastructure for Siemens in Italy. (Image: Siemens.)

As Carlo Ratti, director of the MIT Senseable City Lab and founding partner at Carlo Ratti Associati, told Engineering.com, “data is everywhere but insight is rare. Building owners collect huge volumes of information on energy use, occupancy patterns, and maintenance. Yet much of it goes unanalysed and unused. The issue isn’t a lack of data, but a lack of interpretation.”

This is where developers like Coima come in. Coima is working with Siemens at Porta Nuova, helping to deliver a building management system that integrates HVAC, fire safety, intrusion detection, and electrical systems into a unified platform. According to Siemens, energy use in Gioia 22 has since been cut by 75%, avoiding over 2,200 tonnes of CO2 emissions annually.

Data is now defining infrastructure

Yet even where the technology has demonstrably improved performance, there are signs that not all tenants fully understand or use the tools at their disposal. As Stefano Corbella, Coima’s sustainability officer, told Engineering.com, “most people who own buildings don’t know the data in their buildings, and it’s a shame. Because without data, you cannot manage properly.”

Kas Mohammed, VP of digital energy at Schneider Electric UK and Ireland. (Image: Schneider Electric.)

Kas Mohammed, VP of digital energy at Schneider Electric UK and Ireland, sees this dynamic all the time. “Some are [using the data], but there’s still a gap between collecting data and acting on it,” he told Engineering.com. “The best results come from collaboration. Data is powerful, but it’s even more valuable when combined with real-world feedback from the people using the space.”

So is software really eating infrastructure? Or is the infrastructure simply being upgraded to speak software’s language?

Either way, digital systems are no longer just supporting the built environment; they’re starting to define how it behaves, performs, and evolves over time. According to Nemetschek Group’s Jimmy Abualdenien, who is in charge of the company’s digital twins, this transformation is not just necessary, it’s overdue.

“Digital twins and smart building platforms are not just justified by their operational and environmental costs, they are essential for achieving net zero targets,” Abualdenien told Engineering.com. “By integrating real-time data from IoT sensors and leveraging AI, digital twins provide actionable insights that optimise energy use, reduce carbon emissions, and extend asset lifecycles.”

Still, even among advocates, there is caution. Abualdenien notes that the biggest challenges lie not in the tools themselves, but in the systems that surround them—data silos, fragmented standards, and a lack of interoperability.

His call for open standards and collaborative workflows echoes a broader industry concern that the rush to digitize may create new forms of lock-in or technical debt. This is particularly relevant as building technologies increasingly resemble software stacks, where decisions made at the design stage can affect flexibility and viability for decades to come.

Ratti, of the MIT Senseable City Lab, goes further. “There’s no one-size-fits-all answer,” he told Engineering.com. “It depends on how, and for what, you use [these systems]. Let me use an analogy and take generative AI, for example. It can help optimise energy production and reduce emissions, paying itself back by many orders of magnitude. But if you’re using it to generate meaningless anime videos, the environmental cost is hardly justified.”

The Pirelli 35 building in Milan’s Porta Nuova district. (Image: Siemens.)

Ratti, who says he is already exploring many of these trade-offs through experimental projects at the 2025 Biennale Architettura in Venice, believes that intelligence alone isn’t enough. Purpose matters. So does ownership. So does design.

Empower engineers rather than dictate

That message is echoed by Arup’s Lindsay English, associate principal and leader of Americas Digital Rail, and John Hagerty, an associate leading digital master planning and smart buildings. For them, software is part of a toolkit, a means to an end, not the end itself.

“The toolkit is there to enhance the work of the engineers,” English told Engineering.com. “Yes, return on investment is important. But there are other outcomes that matter too. Reducing risk, improving project quality, helping engineers visualise interdependencies earlier.”

That shift from static models to dynamic systems thinking is already underway in Arup’s rail buildings work. English points to a project where a digital twin strategy wasn’t just designed for operations and maintenance, but was used during construction to manage contractor coordination, track construction progress in real time, and evaluate the sustainability and cost impacts of design changes.

“Most of the cost savings come in operations. But instead of waiting, we used the twin to improve delivery, manage risk, and model outcomes across multiple dimensions,” said English.

Hagerty adds that one of the most pressing challenges is not the technology itself, but the organizational structures around it.

“Clients can usually find the funding,” Hagerty told Engineering.com. “But if they’re not structured to support these systems over time, with an internal champion, a plan for evolution, and alignment across teams, then they fall apart.” He cites the common scenario where a client invests in smart systems for a new flagship building but ignores the legacy estate that makes up most of its footprint.

The tendency to focus on new builds risks missing the bigger opportunity—retrofitting the systems we already have. Here too, interoperability becomes a sticking point.

“Always avoid vendor lock-in,” said English. “We don’t know what the future will hold, but we do know that assets can last 100 years. You want a foundation that’s flexible enough to adapt.”

Expectations are changing

Mohammed, the VP at Schneider Electric, agrees. “In older buildings, facility managers often have to deal with separate, unconnected systems. This makes it hard to see what’s working well and what isn’t. But with modern BMS [building management systems] and sensors, managers can quickly respond to how the building is being used and to changes in the environment,” he says.

Siemens takes a similar long-term view. “We do not consider the journey finished,” says Guenzi. “The technology is scalable and ready for future development.”

That foundation, increasingly, is data. But the value of data depends on its usability.

“I can’t tell you how many portfolio owners still rely on phone calls and clipboards to get the answers they need,” says Arup’s Hagerty.

The goal is not just a single source of truth, but a shared one, where engineers, operators, and tenants can all access and act on the same information. But even access isn’t enough. As Ratti puts it, data is everywhere, but insight is rare. The risk is that we end up designing for complexity rather than clarity. That risk only grows as AI becomes more embedded in design and operations. While tools like generative AI can dramatically accelerate information retrieval and automate workflows, they also raise questions about privacy, governance, and control.

Hagerty suggests the industry is just beginning to reckon with these implications.

“Clients are already asking what all this tech means for privacy. That’s going to become mainstream much faster than people think,” he said.

Carlo Ratti, director of the MIT Senseable City Lab and founding partner at Carlo Ratti Associati. (Image: World Economic Forum.)

Where does this leave the engineer? According to Hagerty, the role is evolving fast. “It used to be that a building engineer focused on mechanical systems. Now they’re fielding calls about IT infrastructure, cyber risk, and AI-driven control systems. The skillset is changing.”

So too are the expectations. As infrastructure becomes more software-defined, the traditional boundaries between architecture, engineering, and operations start to blur. For Arup, the answer lies in flexibility, in building foundations that support change rather than resist it. That may require new procurement models, new forms of governance, and a rethinking of value that goes beyond cost per square metre.

“Smart infrastructure shouldn’t be an optional extra,” said Hagerty. “It’s already part of most modern systems, whether people realise it or not. The question is whether we make those decisions thoughtfully and build systems we can live with in the long term.”

The future of infrastructure, then, may not be one where software simply eats the physical. It may be one where the physical and digital co-evolve, sometimes uneasily, often messily, but with an eye on what matters most. As Ratti reminds us, quoting Cedric Price, “technology is the answer. But what was the question?”

Data landscapes and the product lifecycle
https://www.engineering.com/data-landscapes-and-the-product-lifecycle/ (Tue, 12 Aug 2025 18:33:39 +0000)

The hidden life of data clutter in half-forgotten digital closets is coming to an end.

The post Data landscapes and the product lifecycle appeared first on Engineering.com.

]]>
The torrent of data and information flowing through organizations is relentless. Onward in variety, outward in reach and spread, and upward in quantity—all at higher velocity. All organizations have petabytes of data in countless forms and formats flowing ceaselessly into and through a complex ecosystem of applications, systems, and platforms.

A comprehensive new vision for this is emerging: the Data Landscape, a graphical ecosystem built with or extracted from all of the enterprise’s databases with state-of-the-art data mapping tools and solutions.

These tools and solutions can unlock, gather, track, manage, analyze, and use anything and everything digital in the enterprise, regardless of size and structure: images, text files, videos, CAD files, CAM toolpaths, analyses of many kinds, inspection specifics, and so on.

The best of these tools can access virtually all of the enterprise’s data and metadata, even if it is buried in obsolete legacy formats and systems, or stashed in the “silos” maintained by nearly every department and business unit.

In short, the hidden life of data clutter in half-forgotten digital closets is coming to an end as it is gathered and mapped into Data Landscapes. What this can potentially do for your enterprise’s product development and product lifecycle management activities is significant, although it won’t be as easy as a finger snap.

I am always cautious about using the term “revolution,” but for now, it is the best description for what is happening in the world of data, information, and knowledge. We at CIMdata now see that Data Landscape mapping tools and solutions offer exponentially better ways for managing, securing, searching, finding, accessing, as well as extracting value out of the extended enterprise’s mountain of data.

Fortunately, these Data Landscape tools and solutions are rapidly becoming more effective and less difficult to use. And they foreshadow major implications for product lifecycle management (PLM) activities from concept through life, by:

  • Enabling an organization’s digital threads to bring more data into the digital twins to which they are connected.
  • Supporting more comprehensive and more detailed digital twins.
  • Extending end-to-end lifecycle connectivity to the entire Data Landscape of the extended enterprise, not just product lifecycles.    

These new Data Landscape tools are also beginning to upend time-tested processes for developing bills of materials (BOMs), which are the roadmaps of every new product’s creation, development, production, and service. Soon BOMs will be extended “back” to ideation and what I like to call “the voice of the customer” and forward through field service and on to recycling, remanufacturing, repurposing, or disposal.

There are also major product-development implications in Data Landscape mapping for digital transformation, as we now see that so much more has to be transformed and made usable from the earliest stages of the product lifecycle.

The very name “Data Landscape” points us towards a new approach to mapping, one that far exceeds the traditional conceptual, logical, and physical models that we use to align data with business strategies and goals. Mapping in this new form must cover all of the extended enterprise’s data and its sources, transformations, and destinations … profiling and cataloguing a digital graphical geography of data, applications, tools, systems, and so on.

Without Data Landscape mapping, most of the enterprise’s data remains inaccessible and otherwise useless, denying the enterprise the opportunity for more informed decisions, comprehensive analyses, and competitive products and services.
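At its simplest, such a map is a graph of sources, transformations, and destinations. The hypothetical sketch below stores lineage as adjacency lists and walks it to answer "what lies downstream of this source?"; the system names are illustrative, not from any real landscape tool.

```python
# Hypothetical sketch: a minimal source-to-destination lineage map of the
# kind a Data Landscape tool builds. System names are illustrative.
lineage = {
    "cad_vault":     ["plm"],
    "plm":           ["bom_generator", "warehouse"],
    "bom_generator": ["erp"],
    "erp":           ["warehouse"],
    "mes":           ["warehouse"],
}

def downstream(node, graph):
    """Every destination reachable from `node` (simple depth-first walk)."""
    seen, stack = set(), [node]
    while stack:
        for nxt in graph.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

print(sorted(downstream("cad_vault", lineage)))
# → ['bom_generator', 'erp', 'plm', 'warehouse']
```

The same walk, run in reverse, answers the impact-analysis question ("what feeds this report?") that data governance depends on.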

Defining the data landscape

What is in a Data Landscape? Anything and everything digital, structured or unstructured, formatted data, raw data, and anything in between. Data Landscapes contain sources, transformations, models, transactional databases, analyses (and analytical tools), programming languages, and everything else that anyone has saved. It’s far easier to say what’s not in a Data Landscape: in theory, nothing.

Unlocking new value

Data Landscapes are coming into focus as being capable of creating significant new value in product lifecycle management activities. Thanks to mapping and digital transformation, the possibilities include vastly expanded analyses for more profound product insights; savvier and speedier decision-making about products, features, and capabilities; better and higher quality production; and better end-to-end lifecycle support.

And all of this is constantly evolving, growing, and undergoing change, with updates, transformations, and sometimes even replacement—a powerful justification for implementing effective and ongoing data governance, as well as for frequent remapping.

For everyday use, these tools must be integrated with the enterprises’ dozens of technology stacks, which are used to create, collect, model, transform, store, and analyze data and information for specific purposes or business processes. The fact that the term “Data Landscape” itself is gaining attention shows that these tools work.

Dozens of tools for various mapping processes can be downloaded from solution providers in the Data Landscape marketplace—IBM and Microsoft, of course, as well as Orion Governance, FanRuan Software, Hyve Solutions, AtScale, Zendata, and many others.

In terms of PLM, what tools are we talking about?

In terms of PLM activities, what we connect Data Landscape mapping tools to is highly important. Aside from PLM and the toolsets in use for digital transformation, this means whatever is used to generate BOMs (with or without ERP), manufacturing execution systems (MES), purchasing (including interfaces to component suppliers, contractors, and partners), supply chain maintenance, and service-oriented solutions (e.g., those focused on maintenance, repair, and overhaul).

These connections foreshadow many alterations (and even a few upheavals) in the established processes of developing new products. As with any new technology, users and managers inevitably have to address some tough challenges, such as how to:

  • Choose which of the many mapping tools to implement
  • Connect and integrate those tools
  • Use them effectively
  • Evaluate those tools’ outputs, whether graphical or in some other form
  • Integrate these new tools’ outputs with the enterprise’s data already in use throughout the product lifecycle, both upstream and downstream.

The three pillars of a successful data landscape

To ensure that a Data Landscape and its mapping tools enhance PLM activities, three essential elements must be in place:

•  Training so that users understand the Data Landscape and its many subsystems and components, their uses, and their differences. Data Landscape websites list data warehouses, lakes and lakehouses, platforms, meshes, stacks, marts, and even swamps. A Data Landscape can be seen as the ultimate representation of data throughout the connected environment of an Internet of Things.

•  Adoption and use of artificial intelligence (AI), which is now part of everything digital, including Data Landscape mapping tools and systems—especially those using the enterprise’s Data Stacks. A well-mapped Data Landscape can be seen as the ultimate Large Language Model (LLM) to use with AI. The enormous amount and variety of digital data (and computer clutter) make it essential to know how to find and fix AI’s errors and its occasional hallucinations.

•  Data governance, which is essential for any worthwhile use of Data Landscapes. Data governance policies, standards, rules, and their supervision always serve the same few vital purposes: amid complexity and constant, rapid change, they ensure the security, quality, and integrity of data access and use. Data governance is also crucial for avoiding failures in regulatory compliance, which carry the risk of often hefty fines and reputational damage that may take years to overcome.

Impacts of data landscape tools on PLM

As we observed at the outset, the maturation of digital tools to work with virtually all of the enterprise’s data presents what might be a once-in-a-generation opportunity to create more complete and more accurate BOMs and use them to overhaul the entire product lifecycle.

In effect, we will be gaining useful and reliable access to virtually all of the enterprise’s data and information, no matter what it is or where it is.

This is why we see that the new and improved Data Landscape mapping tools and solutions offer exponentially better ways of searching, finding, and accessing whatever is needed from the extended enterprise’s data, even if not actually using most of it. PLM processes will never be the same.

5 more project deliverables that drive digital transformation success
https://www.engineering.com/5-more-project-deliverables-that-drive-digital-transformation-success/ (Wed, 06 Aug 2025 14:04:08 +0000)

Some significant project deliverables that are essential to every digital transformation project plan.

Digital transformation often changes the way organizations create value. Technology adoption and the growing reliance on digital systems significantly benefit engineers.

From predictive analytics, robotic process automation and artificial intelligence to software that enhances collaboration, organizations are leveraging digital technology to create, capture, and deliver value in new and innovative ways.

Well-defined digital transformation project deliverables lead to project success. Vaguely defined or missing project deliverables create a risk of missed expectations, delays and cost overruns.

Here are the significant project deliverables essential to every digital transformation project plan:

  1. Project charter
  2. Data analytics and visualization strategy
  3. Generative AI strategy
  4. Data conversion strategy
  5. Data profiling strategy
  6. Data integration strategy
  7. Data conversion testing strategy
  8. Data quality strategy
  9. Risk Assessment
  10. Change management plan
  11. Data conversion reports

These deliverables merit more attention for digital transformation projects than routine systems projects. We’ll examine deliverables six through eleven in this article. To read the first article, click here.

Data integration strategy

Organizations achieve most of the value of digital transformation from new data-based insights and previously impossible process efficiencies. The insights are revealed through sophisticated data analytics and visualization, which rely on data integration. The process efficiencies are achieved in the same way.

The data integration strategy deliverable describes how data from every data source will be integrated with data from other data sources. The primary integration strategies are:

  • Populate a lakehouse or a data warehouse with data from multiple data sources. This strategy persists all the integrated data. Implementation and operation are more expensive but deliver the most integration opportunities and the best query performance.
  • Keep data sources where they are and populate cross-reference tables for key values. This strategy persists only key values to create the illusion of integration. It’s appealing because it’s the easiest and cheapest to implement and operate. It sacrifices query performance.

The value of the data integration strategy lies in informing the software development effort for each of the likely tools, such as extract, transform, and load (ETL), data pipelines, and custom software.
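The second, cross-reference strategy can be pictured with a toy example: records stay in their source systems, and a key-mapping table creates the integrated view on demand. All records and keys below are illustrative.

```python
# Hypothetical sketch: leave data in place and join through a key
# cross-reference table. Systems, records, and keys are illustrative.
crm  = {"C-100": {"name": "Acme"},  "C-200": {"name": "Globex"}}
erp  = {"9001":  {"credit": 50000}, "9002":  {"credit": 12000}}
xref = {"C-100": "9001", "C-200": "9002"}  # CRM key -> ERP key

def integrated_view(crm_key):
    """Resolve one customer across both systems via the cross-reference table."""
    record = dict(crm[crm_key])        # copy so the source stays untouched
    record.update(erp[xref[crm_key]])  # the extra lookup is the query cost
    return record

print(integrated_view("C-100"))  # → {'name': 'Acme', 'credit': 50000}
```

The extra lookup on every query is exactly the performance sacrifice the strategy description mentions; a lakehouse or warehouse pays that cost once, at load time, instead.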

Data conversion testing strategy

It’s impossible for engineers to overestimate the effort required for data conversion testing. Every digital transformation project will involve some risk of data degradation as a result of the conversion process.

The data conversion testing strategy is a deliverable that describes who will test the data conversion and how, considering the following:

  • Automation opportunities.
  • Statistical analysis techniques.
  • Mismatches in joins of keys and foreign keys.
  • Strategies for minimizing manual inspection.

The value of the data conversion testing strategy lies in describing the skills and estimating the effort needed to confirm the adequacy of the data conversion.
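One of the checks listed above, catching mismatches in joins of keys and foreign keys, can be sketched in a few lines; the tables below are illustrative.

```python
# Hypothetical sketch: find foreign keys in converted child rows that no
# longer match any converted parent key. Tables are illustrative.
def orphan_foreign_keys(child_rows, fk_field, parent_keys):
    """Return child rows whose foreign key has no matching parent."""
    return [row for row in child_rows if row[fk_field] not in parent_keys]

parts = {"P-1", "P-2", "P-3"}             # converted parent keys
bom   = [{"part": "P-1", "qty": 4},
         {"part": "P-9", "qty": 1}]       # P-9 was dropped during conversion
print(orphan_foreign_keys(bom, "part", parts))
# → [{'part': 'P-9', 'qty': 1}]
```

Run against every parent-child relationship, a check like this replaces a great deal of the manual inspection the strategy aims to minimize.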

Data quality strategy

The current data quality in the data sources will almost always disappoint. That reality leads to a data quality strategy deliverable for digital transformation that describes the following:

  • Acceptable data quality level for at least the most critical data elements.
  • Data enhancement actions that will augment available data to more fully meet end-user expectations.
  • Data quality maintenance processes to maintain data quality for the various data sources.

The data quality strategy enables the organization to achieve a consensus on what constitutes sufficient data quality and which data elements are most critical.
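A completeness check against the agreed quality levels might look like the hypothetical sketch below; the fields, rows, and target thresholds are illustrative.

```python
# Hypothetical sketch: measure completeness of critical fields against
# agreed thresholds. Fields, rows, and targets are illustrative.
def completeness(rows, field):
    """Fraction of rows with a non-empty value for `field`."""
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

rows = [{"part_no": "A1", "weight": 2.5},
        {"part_no": "A2", "weight": None},
        {"part_no": "A3", "weight": 1.1}]

for field, required in {"part_no": 1.0, "weight": 0.95}.items():
    score = completeness(rows, field)
    print(f"{field}: {score:.0%} ({'OK' if score >= required else 'BELOW TARGET'})")
```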

Risk assessment

Digital transformation projects experience various risks. This deliverable reminds management that these projects are not simple or easy.

The specific risks that digital transformation projects experience include:

  • Risk of data issues, such as insufficient quality and complex integration. These issues will add cost and schedule to digital transformation projects.
  • The risk of viewing AI as a silver bullet. The article “How can engineers reduce AI model hallucinations?” explains how to alleviate this risk.
  • The risk of data analytics and visualization software complexity. This complexity adds to the cost of training and creates a risk of misleading results.
  • The risk of an overly ambitious scope overwhelming the team and budget. This unrealizable scope creates disappointment and reduces commitment to the project.

Identifying and assessing potential risks associated with the transformation helps the project team mitigate them and minimize their impact if they materialize. The article “How engineers can mitigate AI risks in digital transformation” discusses the most common ones.

Change management plan

Digital transformation projects almost always introduce significant changes to business processes. Successful projects define a comprehensive people change management plan that includes:

  • A statement of the change management goal, approach and supporting objectives.
  • A plan for engaging project sponsors and stakeholders.
  • A description of the current situation.
  • A description of change management strategies, such as training, change agents and end-user support.
  • Recommended change management roles and resources.
  • An approximate timeline.

This plan, when executed as intended, ensures that end-users, such as engineers, adopt the new processes and digital tools. Without that adoption, the organization will not realize the benefits of the project.

Data conversion reports

The data conversion test report contains the results of the data conversion design and testing work for each datastore. It includes:

  • The number of software modules developed to perform data conversion.
  • A summary of the number of rows that could not be converted, with the cause of each failure.
  • A summary of the data quality improvements required.

The final data conversion report contains the results of the data conversion work by datastore. It includes a summary of the:

  • Number of rows converted.
  • Data quality improvements that were made.
  • Data issues that were not addressed.
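The row-level tallies in these reports can be produced mechanically from per-row conversion outcomes. A minimal sketch, assuming a hypothetical list of outcomes in which None marks a successfully converted row and a string names the failure cause:

```python
from collections import Counter

# Hypothetical per-row conversion outcomes for one datastore:
# None means the row converted; a string names the failure cause.
outcomes = [None, None, "invalid date", None, "missing key", "invalid date", None]

converted = sum(1 for o in outcomes if o is None)
failures = Counter(o for o in outcomes if o is not None)

print(f"Rows converted: {converted}")
for cause, count in failures.most_common():
    print(f"  {count} rows failed: {cause}")
```

Grouping failures by cause, as Counter does here, is what lets the report summarize which data quality improvements are still required rather than listing thousands of individual rejects.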

These deliverables often frustrate stakeholders because engineers invariably recommend more effort to:

  • Improve historical data quality.
  • Strengthen data quality processes to maintain future data quality.

Paying detailed attention to these digital transformation project deliverables greatly improves your project’s chances of success. This list is a subset of the overall set of deliverables that all projects typically complete.

The post 5 more project deliverables that drive digital transformation success appeared first on Engineering.com.

AI and robotics-powered microfactory rebuilds homes lost to the California wildfires https://www.engineering.com/ai-and-robotics-powered-microfactory-rebuilds-homes-lost-to-the-california-wildfires/ Tue, 05 Aug 2025 17:30:58 +0000 https://www.engineering.com/?p=141893 This video shows a collaboration between ABB and Cosmic Buildings to build homes on-site using AI, digital twins and robotics.

The post AI and robotics-powered microfactory rebuilds homes lost to the California wildfires appeared first on Engineering.com.

ABB Robotics has partnered with construction technology company Cosmic Buildings to help rebuild areas devastated by the 2025 Southern California wildfires using AI-powered mobile robotic microfactories.

After the wildfires burned thousands of acres, destroying homes, infrastructure, and natural habitats, this initiative will deploy the microfactory in Pacific Palisades, California, to build modular structures onsite, offering a glimpse into the future of affordable housing construction.

The microfactory collaboration between ABB and Cosmic Buildings uses simulation, AI and robotics to build homes on-site. (Image: screen capture from YouTube video.)

Watch the video on YouTube.

“Together, Cosmic and ABB Robotics are rewriting the rules of construction and disaster recovery,” said Marc Segura, President of ABB Robotics Division. “By integrating our robots and digital twin technologies into Cosmic’s AI-powered mobile microfactory, we’re enabling real-time, precision automation ideal for remote and disaster-affected sites.”

These microfactories integrate ABB’s IRB 6710 robots and RobotStudio digital twin software with Cosmic’s Robotic Workstation Cell and AI-driven Building Information Model (BIM) – an end-to-end platform that handles design, permitting, procurement, robotic fabrication and assembly.

Housed within an on-site microfactory, these systems fabricate custom structural wall panels with millimeter precision just-in-time for assembly at the construction site.

Cosmic uses ABB’s RobotStudio with its AI BIM, allowing the entire build process to be simulated and optimized in a digital environment before deployment. Once on location, Cosmic’s AI and computer vision systems work with the robots, making real-time decisions, detecting issues and ensuring consistent quality.

These homes are built with non-combustible materials, solar and battery backup systems, and water independence through greywater recycling and renewable water generation. Each home exceeds California’s wildfire and energy efficiency codes. By delivering a turnkey experience from permitting to final construction, Cosmic is redefining what’s possible in emergency recovery.

Cosmic says its mobile microfactory reduces construction time by up to 70% and lowers total building costs by approximately 30% compared to conventional methods. Homes can be delivered in just 12 weeks at $550–$700 per square foot, compared to Los Angeles’ typical $800–$1,000 range.

“Our mobile microfactory is fast enough for disaster recovery, efficient enough to drastically lower costs, and smart enough not to compromise on quality,” said Sasha Jokic, Founder and CEO of Cosmic Buildings. “By integrating robotic automation with AI reasoning and on-site deployment, Cosmic achieves construction speeds three times faster than traditional methods, completing projects in as little as three months.”
