Leveraging simulation-driven design with PTC Creo
https://www.engineering.com/leveraging-simulation-driven-design-with-ptc-creo/

Maintenance Reseller Corp. has sponsored this post.

Traditional design workflows often follow a linear path, from concept creation to prototyping and finally performance analysis. However, when performance analysis exposes a critical flaw late in that sequence, the resulting redesign extends production cycles and increases development costs. To avoid this, many engineers are adopting simulation-driven design, a process that integrates analysis into the workflow from the start and throughout the entire process.

“By moving analysis earlier in the workflow, engineers can design for risk mitigation, quality, and performance,” says Jerry Raether, sales training program manager at PTC. “Instead of creating a part that passes validation, simulation-driven design enables them to refine and optimize until they achieve the best possible design.”

Integrated simulation also reduces development cycles by identifying and correcting potential flaws in real-time, thereby minimizing the back-and-forth between design and analysis teams. This allows engineering companies to bring products to market faster and reduce development costs.

To support this fundamental shift, PTC has integrated analysis tools directly within the Creo 3D CAD environment. Solutions such as Creo Simulation Live and Creo Ansys Simulation eliminate the need for data export or manual setup. The workflow offers a strategic advantage for product development teams, allowing them to refine a concept in real-time based on live performance data.

In this article, we will discuss two key workflows that use simulation-driven design in PTC Creo: one for new product development and another for existing product improvement. Each of these workflows demonstrates how extensions like Generative Topology Optimization (GTO), Creo Simulation Live (CSL), Behavioral Modeling Extension (BMX), and Creo Ansys Simulation enable an integrated design-analysis process.

Workflow 1: Simulation-driven new product design

In a new product design workflow, GTO is used as a concept design tool to rapidly explore and reveal innovative design solutions. GTO is a 3D CAD capability that uses AI-driven algorithms and finite element analysis (FEA) to autonomously create optimal designs that satisfy user-specified requirements and goals, including preferred materials and manufacturing methods.

Figure 1. Generative Topology Optimization results for bracket design. (Image: Maintenance Reseller Corp.)

The workflow starts with Creo multibody design, which is used to create the design spaces representing the starting geometry, the preserved geometry and, optionally, excluded geometry. Once the geometry is created, the user opens the GTO application and completes the setup by designating the design spaces and defining the physics and design criteria. The last step is to generate the design by selecting the optimize option. The user can run as many optimizations as desired to evaluate design alternatives while varying material selection, manufacturing methods and geometric constraints.

“Using generative, I define all these criteria, and then I tell it to optimize and solve for the part. What it’s going to do is come back with a really unique-looking part that will be something an engineer wouldn’t naturally think about designing that way. It speeds time to innovation because it shows a lot of different ways to create that part that an engineer would never think of,” explains Raether.
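To make the setup concrete, the sketch below outlines the kinds of inputs such a study collects before the optimize step. It is a conceptual Python outline under assumed names, not the actual Creo GTO interface or API.

```python
# Conceptual outline of a GTO-style study setup -- illustrative only.
# Class and field names are hypothetical, not the Creo GTO API.
from dataclasses import dataclass, field

@dataclass
class TopologyStudy:
    starting_geometry: str                  # design space body to carve material from
    preserved_geometry: list                # bodies that must remain (bosses, mounting faces)
    excluded_geometry: list = field(default_factory=list)   # keep-out regions
    material: str = "AlSi10Mg"              # assumed aluminum alloy
    manufacturing_method: str = "additive"  # or "die_casting", "3_axis_milling"
    loads: dict = field(default_factory=dict)        # e.g. {"pin_interface": (0, 0, -500)} N
    constraints: dict = field(default_factory=dict)  # e.g. {"bolt_bosses": "fixed"}
    objective: str = "minimize_mass"
    stress_limit_mpa: float = 250.0         # assumed allowable stress

study = TopologyStudy(
    starting_geometry="bracket_design_space",
    preserved_geometry=["bolt_bosses", "pin_interface"],
    loads={"pin_interface": (0.0, 0.0, -500.0)},
    constraints={"bolt_bosses": "fixed"},
)
# A call like optimize(study) would then drive the FEA-based topology optimization,
# and the study can be re-run with different materials or manufacturing methods.
```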

Jacobs Engineering used PTC Creo Generative Design to design and optimize two components of the NASA spacesuit Portable Life Support System: the O2 tank bracket and the CO2 sensor bracket. The company achieved significant improvements, with a 21 percent and 20 percent weight reduction, respectively.

After GTO has provided the conceptual blueprint, the engineer begins the recreation phase, using traditional Creo tools such as parametric design, surfacing, and freestyle, to create a cleaner, more manufacturable version. The generative output is not a final, manufacturable product, as it may not be suitable for die-casting or NC machining.

Figure 2. Creo Simulation Live stress analysis on a freestyle reconstruction of a topology-optimized bracket. (Image: Maintenance Reseller Corp.)

While the engineer reverse-engineers the GTO output, CSL runs in the background, providing real-time feedback on key performance metrics, such as stress, strain and deformation. The tool is purpose-built for the design engineer to prioritize speed and ease of use over the absolute accuracy required by a specialist analyst.

The value of this trade-off between speed and accuracy is profound. For example, an engineer can iterate rapidly, exploring multiple design alternatives and receiving feedback on the impact of each change. This enables the immediate correction of potential design flaws, reducing the time-consuming redesign cycles associated with traditional workflows.

Engineers also have the option to combine CSL with BMX to run sensitivity studies that determine which variables most strongly affect stress. They can also run optimization studies, for example to minimize mass while keeping stress below a target, which automates exploration of the design space. BMX is not mandatory, and engineers can explore these trade-offs iteratively in CSL, but it provides a structured way to identify and adjust the key variables.
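The kind of optimization study described above can be pictured as a constrained minimization. The sketch below is a minimal illustration using SciPy, with a textbook cantilever bending formula standing in for the FEA solve; the dimensions, load and stress limit are assumed values, and this is not how BMX is implemented.

```python
# Minimal sketch of a mass-minimization study with a stress ceiling.
# A closed-form bending formula stands in for the CSL/FEA solve; all values are assumed.
from scipy.optimize import minimize

RHO = 2700.0          # kg/m^3, aluminum density
LENGTH = 0.2          # m, cantilever bracket length
FORCE = 500.0         # N, tip load
STRESS_LIMIT = 250e6  # Pa, allowable stress

def mass(x):
    width, thickness = x
    return RHO * LENGTH * width * thickness

def bending_stress(x):
    # sigma = 6*M / (b*t^2) for a rectangular section with root moment M = F*L
    width, thickness = x
    return 6.0 * FORCE * LENGTH / (width * thickness**2)

result = minimize(
    mass,
    x0=[0.03, 0.01],                       # initial width and thickness, m
    bounds=[(0.005, 0.05), (0.002, 0.02)],
    constraints=[{"type": "ineq", "fun": lambda x: STRESS_LIMIT - bending_stress(x)}],
    method="SLSQP",
)
print(f"width = {result.x[0]*1e3:.1f} mm, thickness = {result.x[1]*1e3:.1f} mm, "
      f"mass = {mass(result.x):.3f} kg, stress = {bending_stress(result.x)/1e6:.0f} MPa")
```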

Once the engineer thinks that the design is ready, they can switch from CSL to Creo Ansys Simulation, which functions as a traditional analysis tool. The engineer stops making design edits and focuses purely on simulation accuracy. All boundary conditions (loads, constraints) are transferred into Creo Ansys Simulation for results that are higher fidelity and more precise.

“CSL may not be 100% accurate, but that’s acceptable given the speed at which I’m able to receive analysis feedback to evaluate design choices. While I’m actively designing and making changes to the geometry, the goal is speed and directionally correct feedback. When I’m ready to hand the design over to analysts or manufacturing, that’s when margin of error matters. This is when I switch to Creo Ansys Simulation and focus on the accuracy of results to validate the final geometry,” Raether adds.

Workflow 2: Simulation-led continuous improvement

The Kaizen workflow supports the concept of continuous improvement. Simulation-driven design is equally useful for engineers who want to use an existing part’s geometry as the starting point for the generative process.

The key setup here is to use the “Limit Volume” constraint within GTO. The engineer can specify a target volume, for example, 70 percent of the existing part’s volume. GTO then systematically removes the least critical material. This shows which areas of the existing part contribute most to structural integrity, and which areas are less critical to the overall design. With this information, engineers can make informed decisions to enhance the quality and performance of next generation designs. Throughout the Kaizen process, engineers use CSL to evaluate design choices in pursuit of specified design goals and objectives.

Figure 3. Generative Topology Optimization result with 70 percent Limit Volume constraint applied for a lightweight bracket design. (Image: Maintenance Reseller Corp.)
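Conceptually, the Limit Volume pass can be thought of as ranking material by how much it contributes to stiffness and discarding the lowest-ranked elements until the volume target is met. The toy numpy sketch below illustrates that ranking idea with made-up element scores; it is not Creo's actual optimization algorithm.

```python
# Toy illustration of volume-limited material ranking -- not Creo's algorithm.
# Each "element" carries a strain-energy score from a prior FEA pass (synthetic values here).
import numpy as np

rng = np.random.default_rng(0)
n_elements = 10_000
strain_energy = rng.lognormal(mean=0.0, sigma=1.0, size=n_elements)

target_fraction = 0.70                       # keep 70 percent of the original volume
n_keep = int(target_fraction * n_elements)   # uniform element volumes assumed

# Keep the elements that contribute most to stiffness; the rest are candidates for removal.
keep_idx = np.argsort(strain_energy)[-n_keep:]
kept_share = strain_energy[keep_idx].sum() / strain_energy.sum()

print(f"kept {n_keep} of {n_elements} elements ({target_fraction:.0%} of volume) "
      f"carrying {kept_share:.1%} of the total strain energy")
```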

As part of the SuperTruck project, Volvo used GTO to enhance an existing product, the forward engine mount, which was designed 17 years ago using cast iron and remains in production. The automotive manufacturer achieved a 75 percent weight reduction and an 82 percent decrease in peak stress.

The continuous improvement workflow has a direct impact on the company’s sustainability goals. By identifying and removing unnecessary material from existing parts, companies can reduce material costs and enhance product performance. For heavy-duty vehicles, even a slight weight reduction can lead to substantial gains in fuel efficiency.

Another option is to use CSL throughout the Kaizen workflow. The process starts by using CSL to identify and document baseline measures such as stress, strain and deformation. These values serve as checkpoints when evaluating design modifications and changes throughout the Kaizen workflow. As the engineer makes changes to the design, real-time analysis reveals whether each change helped or hurt the design. Using CSL eliminates assumptions and guesswork when exploring design options and alternatives. Engineers can focus on design refinement and optimization to introduce innovation and quality into next generation designs. The final step in the process is to use Creo Ansys Simulation to perform high-fidelity analysis.
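That checkpointing habit is easy to picture in code. The minimal sketch below assumes hand-entered metric values (lower is better for each) rather than any actual CSL output format.

```python
# Minimal sketch of baseline checkpointing for a Kaizen-style workflow.
# Metric names and values are illustrative assumptions, not CSL output.
baseline = {"max_stress_mpa": 182.0, "max_deflection_mm": 0.42, "mass_kg": 1.85}

def compare_to_baseline(current: dict, baseline: dict) -> None:
    """Report the percent change of each metric relative to the recorded baseline."""
    for name, base_value in baseline.items():
        delta = 100.0 * (current[name] - base_value) / base_value
        trend = "improved" if delta < 0 else "regressed"   # lower is better for these metrics
        print(f"{name}: {current[name]:.2f} ({delta:+.1f}% vs baseline, {trend})")

# After a design change, feed the latest readings back in:
compare_to_baseline(
    {"max_stress_mpa": 169.5, "max_deflection_mm": 0.47, "mass_kg": 1.62},
    baseline,
)
```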

Creo adoption through MRC support

Maintenance Reseller Corporation (MRC), a long-standing PTC partner, has always been at the forefront in driving the adoption of PTC Creo. The primary role of MRC is to serve as a sales, renewal, and support channel for PTC software, including the Creo platform.

“Our focus has always been on ensuring that our customers utilize Creo to the most of their advantage, whether that’s by keeping them informed of extensions that can assist their design or optimization process, or by providing updates on each new Creo release or simulation update,” says Esthefany Hung, marketing manager at Maintenance Reseller Corporation.

Conclusion

Simulation-driven design, using PTC Creo and its extensions, provides a technique for developing products with a continuous feedback loop. This yields numerous benefits, including higher product quality, reduced development costs, faster time-to-market, and risk mitigation. In addition to this, simulation-driven workflows tend to deliver innovative solutions that drive the development of next-generation products across various industries.

It is now evident that traditional sequential workflows are no longer sufficient to maintain a competitive edge in today’s demanding market. The adoption of simulation-driven design is the next step for engineering teams.

Ready to take simulation-driven design further?

Download the free whitepaper “How Creo Supports Sustainable Product Development” to see how you can reduce costs, accelerate innovation and build more sustainable products. MRC is here to answer your questions and help you get the most out of Creo — reach out anytime.


Siemens NX is getting an AI copilot for CAM
https://www.engineering.com/siemens-nx-is-getting-an-ai-copilot-for-cam/

By the end of the year, NX X Manufacturing users will have access to an AI assistant to help them quickly program parts.

Welcome to Engineering Paper. Here’s all the design and simulation software news you missed last week.

… Is what I’d normally say, but software news has been slow this Labor Day pre-kend, weekend, and week-in. Which gives me a perfect opportunity to cover something I’ve been meaning to write about for months, and you’ve been meaning to read about for just as long (though you didn’t know it).

AI is coming to Siemens NX X Manufacturing (again)

Back in July I was in Detroit for Realize Live 2025, Siemens’ annual user conference. I covered the highlights at the time, but I didn’t get around to an interview that I had with Siemens’ Michael Taesch, senior director of product management, and Sashko Kurciski, marketing director for digital manufacturing, about AI for NX CAM.

“We’re trying to see how we fuse AI into our manufacturing process,” Taesch told me.

One result of that fusion is an upcoming AI tool that generates machining strategies for NX users. It’s not available yet and it doesn’t have an official name, but for the sake of conversation Taesch and Kurciski called it the CAM Copilot within NX X Manufacturing. For the sake of this article, I’m going to call it NX CAM Copilot.

With a couple clicks, NX CAM Copilot—not its official name—will generate three possible ways to machine a given feature (and more if needed). The user can take these as a starting point to develop their program. The AI will debut with 2.5 and 3-axis machining strategies and expand from there.

“We’re not here to replace the manufacturing engineer,” Taesch said. “We want to give him multiple processes, and it’s up to him to pick the right one based on his knowledge.”

With a familiar thumbs up/down system, users will be able to rate the suggestions and, in theory, the AI will learn their preferences and improve over time.

“Manufacturing is complex,” Kurciski said. “You can machine a single feature in—I’m not exaggerating—10 different ways with 15 different tools. So you have your own company best practices, and capturing this is very important.”

In that sense, Siemens sees NX CAM Copilot—again, not its official name—as a tool for knowledge capture, one that can help bridge that engineering skills gap that technology vendors keep talking about.

“Company A will program their way. Company B will have a completely different process,” Taesch added. “You cannot have a generic solution. You need to have something that’s tailored, personalized to the customers.”

But the more direct utility of NX CAM Copilot is in its efficiency. Like anything, CAM programming takes time, even for the experts.

“Though I’ve been 10 years in NX CAM, I always make mistakes when I start the programming,” Taesch said. “Here, in a couple clicks, I get a result, and then all I have to do is just fine tune, use my knowledge to adjust a little bit.”

Taesch estimates that NX CAM Copilot can save 80% or more of the time that users would otherwise spend on CAM programming. “I can quickly program a part. Might not be perfect, but if I wanted to quickly quote it, in five minutes I can get something up and running,” he said.

NX CAM Copilot is currently in beta, but Taesch said Siemens plans to release it by the end of 2025. Taesch expects the AI tool to be an add-on to NX X Manufacturing through Siemens’ value-based licensing program, though that isn’t a final decision. (There’s an extant NX X Manufacturing copilot, which is just a chatbot, that’s currently available as a value-based add-on. Perhaps this new tool will be integrated into that.)

NX CAM Copilot, which is a placeholder name, reminds me of Toolpath, which sounds like a placeholder name but isn’t. Toolpath is a web-based platform that uses AI to generate toolpaths (and as I covered last week, it just announced Autodesk as an investor). I brought up the comparison to Taesch and Kurciski, who offered the opinion that Toolpath is more of a black box solution focused on quoting, while NX CAM Copilot provides more tailored manufacturing choices.

“We have another project that is working in parallel where we automatically ingest for quoting,” Taesch added. “It’s going to be tied into NX CAM. And the idea will be to leverage this tool to provide some simple quotes. Not there yet, but we have it in mind.”

So there you have it—an AI engineering tool from Siemens that’s a lot more interesting than Design Copilot NX. I’ll bring you more on NX CAM Copilot, or whatever it will be called, as soon as I learn it (or several months later, whichever comes first).

One last link

Ever wonder what’s inside a 1950s Heathkit vacuum tube oscilloscope? EE World’s Martin Rowe reveals all in his latest teardown—but he needs your help to put it all together.

Got news, tips, comments, or complaints? Send them my way: malba@wtwhmedia.com.

Autodesk invests in AI CAM platform Toolpath
https://www.engineering.com/autodesk-invests-in-ai-cam-platform-toolpath/

The Fusion developer joins toolmaker Kennametal and CAM developer ModuleWorks as strategic investors in the cloud manufacturing tool.

Welcome to Engineering Paper and this week’s harvest of design and simulation software news.

Toolpath, the CAM startup using AI to automate toolpath creation, has a new investor, and who it is may surprise you (but it shouldn’t).

“Our new investor is Autodesk,” Al Whatmough, Toolpath CEO, told me. “This closes out all our seed funding.”

The companies didn’t disclose the amount of the investment, but Whatmough said it was part of the strategic investment round that closed in May 2025 and brought Toolpath’s funds to nearly $20 million. Toolmaker Kennametal led that round, which also included CAM kernel developer ModuleWorks.

“There was space in the round for a software leader,” Whatmough said. That Autodesk filled that space was only natural. For one thing, Whatmough was Autodesk’s director of product management for manufacturing until 2021. For another, Toolpath had an existing integration with Autodesk Fusion, which Whatmough praised as “the dominant cloud-based CAM system.”

“Nobody else is anywhere close,” he said. “Whether we had [Autodesk’s] investment or not, Fusion would still be the platform we put our automation on.”

With Autodesk’s investment, Toolpath can take the Fusion integration even further. Autodesk pointed out the potential in a blog post from Stephen Hooper, VP of cloud-based product design and manufacturing solutions.

“[Our investment] marks the start of a strategic partnership, enabling our two companies to integrate closed-loop, fully automated workflows into Autodesk Fusion. Looking ahead, combining Toolpath’s technology with Autodesk’s Manufacturing Data Model would enable Fusion users to automatically analyze manufacturability, plan machine strategies, and send complete programs to Fusion,” Hooper wrote.

A Toolpath toolpath imported into Autodesk Fusion. Note the Toolpath addon in the top right. (Image: Toolpath.)

When I last spoke with him, Whatmough told me that Toolpath planned to support other CAM systems beyond Fusion. I asked him if that’s still the case.

“Our focus is Fusion, just because there’s a core alignment in the current customers,” he said. “Fusion users, by definition, tend to be on the more innovative side. It’s the most modern CAM system. They don’t have a cloud aversion.”

That said, Whatmough emphasized that there’s nothing about Toolpath, either technically or obligatorily, that makes it exclusive to Fusion.

“When we think about CAM integration, it’s like a post processor for us,” he explained. “Today we output the instructions to grab onto the Fusion steering wheel. We’ll make an amazing experience there. Once we do that, we can open up to other CAM systems or directly to the machine.”

One more thing I learned from Whatmough: Toolpath is freely available for hobbyist use through this application process. If you try it out, let me know your thoughts at malba@wtwhmedia.com.

Jon on Onshape

This summer Onshape hit the memorable milestone of 200 updates. The cloud CAD platform is updated like clockwork every three weeks, so if you do the math you’ll find that time is moving a lot faster than it ought to for what I still think of as a fresh new CAD startup.

Thoughts of mortality aside, congratulations to Onshape.

To mark the occasion, I caught up with co-founder Jon Hirschtick to reflect on Onshape’s evolution and where it might go next. You can read all about it in Looking back on 200 releases of Onshape: Q&A with Jon Hirschtick.

Quick hits

  • Coreform has released the latest version of its hex meshing software, Coreform Cubit 2025.8. The update introduces a “sleeker, more modern look” and provides “more robustness, better quality elements, and improved capabilities,” according to Coreform.
  • Electromagnetic simulation software developer Nullspace announced $2.5 million in seed funding that it will use to “expand the engineering team, accelerate product development, and scale go-to-market efforts as we target growing demand across aerospace, defense, quantum computing, and AI-enabled hardware markets,” according to CEO Masha Petrova.
  • CoLab, the Canadian company developing an AI-powered design review tool, commissioned a survey of engineering leaders and discovered, in a stroke of fortuitous validation, that “100% of survey respondents said that AI would speed up design review times.”

One last link

Don’t sit down to read this one: Design World contributor Mark Jones with Finding inspiration in unlikely places.

Got news, tips, comments, or complaints? Send them my way: malba@wtwhmedia.com.

Ansys and Nvidia strike deal for easier Omniverse access in simulation
https://www.engineering.com/ansys-and-nvidia-strike-deal-for-easier-omniverse-access-in-simulation/

Ansys will license Omniverse technology in CFD and autonomous solutions, plus more engineering software news.

This is Engineering Paper, and here’s the latest design and simulation software news.

Ansys, now a part of Synopsys, has signed an agreement with Nvidia to license, sell, and support Nvidia’s Omniverse technology embedded in Ansys’s simulation software.

According to Ansys, the deal will allow it to deliver easy access to Omniverse technologies and libraries, starting with CFD and autonomous solutions.

“Visualizing fluid dynamics in physically based digital environments enables engineers to analyze complex datasets more intuitively, resulting in smarter, faster design optimization for even the most challenging engineering tasks,” reads Ansys’ announcement.

An Omniverse-powered Ansys Fluent simulation of vehicle aerodynamics. (Image: Ansys.)

Nvidia’s Omniverse is expanding lately. Just a couple weeks ago Nvidia and PTC announced a similar integration for Creo and Windchill, and not long before that Nvidia and Tech Soft 3D teamed up to bolster OpenUSD, the 3D file framework used in Omniverse. Both PTC and Tech Soft 3D also joined the Alliance for OpenUSD (AOUSD), which Nvidia co-founded two years ago. Ansys already joined the AOUSD in yet another collaboration with Nvidia in March 2024.

In other Omniverse news, Nvidia announced new Omniverse libraries and SDKs for robotics development and deployment. You can read more about those here.

Vectorworks 2026 coming this September

Vectorworks has announced details of its upcoming software release, Vectorworks 2026. The design and BIM platform will be available this September with new features that Vectorworks says will optimize several major workflows.

One new feature is that Vectorworks Cloud Services will be integrated directly into the Vectorworks 2026 desktop application, allowing users to leverage cloud computing resources without leaving the program. Vectorworks 2026 also includes a new tool called the File Health Checker palette that “helps keep files in optimal condition and ensures that projects run smoothly and efficiently, especially when integrating files from external sources.” In particular, the cloud will process large Revit file imports in the background.

Another new feature of Vectorworks 2026 is the Sustainability Dashboard, a hub providing real-time insight into compliance targets including embodied carbon calculations, biodiversity net gain, and other metrics.

For more details on Vectorworks 2026, see Vectorworks.com.

Engys releases Helyx 4.4.0

Engys has released Helyx 4.4.0, the latest version of its general purpose CFD software, as well as updated versions of its add-ons Helyx-Coupled, Helyx-Adjoint, and Helyx-Marine.

(Image: Engys.)

Helyx 4.4.0 adds several features including a remote file browser, which allows users to browse directly from the Helyx interface via SSH; enhanced geometry tools, including the ability to import Rhino 3DM files; new meshing capabilities, including new options for isotropic, anisotropic, and cylindrical base meshes; new setup features, including new porous media thermal models; and new post-processing tools, including improved runtime visualization. The latest release also includes solver and performance enhancements, according to Engys.

Kubotek Kosmos releases MBD File Utilities 7.1

Kubotek Kosmos, developer of the direct CAD modeler KeyCreator, has released version 7.1 of its MBD File Utilities software suite.

(Image: Kubotek Kosmos.)

The update extends CAD file support to Dassault Systèmes Catia V5 2025, Autodesk Fusion and Inventor 2026 formats, and Siemens NX 2412 and Parasolid 37.1. View and Convert, two of the four MBD File Utilities, also add support for reading STEP XML 3D assembly structures.

The new version also includes quality of life improvements across the MBD File Utilities suite, such as enhancements to saved views and text attributes that always face the display plane.

One last link

If you’ve ever seen, inhabited, or built a building, don’t miss Marc Ambasna-Jones’ latest article for Engineering.com: How software is redefining sustainable building engineering.

Got news, tips, comments, or complaints? Send them my way: malba@wtwhmedia.com.

How are engineers using spatial computing?
https://www.engineering.com/how-are-engineers-using-spatial-computing/

AR, VR, and MR are increasingly valuable as tools for visualization, collaboration, presentations and more.

While you probably already have a device that can access augmented reality (AR), you can’t visit virtual reality without a VR headset. But there are plenty of options to choose from, ranging from consumer-targeted products for a few hundred dollars to enterprise VR headsets that provide better resolution and responsiveness but cost thousands of dollars. Some VR headsets are self-contained computers with internal processors, but others depend on a connection to a GPU-equipped engineering workstation.

Of course, the hardware alone doesn’t get you very far. Software with support for spatial computing is growing by the day. Some CAD programs support VR directly, allowing designers to easily switch to a virtual view of their models. Other software caters to VR design reviews with features for collaboration and markup. Game engine software, sometimes called real-time 3D software, can be used to develop custom AR or VR experiences using existing CAD models.

Though there’s an upfront cost to getting started with spatial computing—both in the price of headsets and software as well as the learning curve for users—for many engineers, the cost is well worth it. VR provides an unparalleled way to visualize and refine a design, and has thus become a part of many engineering workflows.

In this article, we’ll look more closely at the main ways engineers, architects, manufacturers and others are using spatial computing.

Visualization and collaboration

By strapping on a mixed reality (MR) headset or peering through an AR-capable phone or tablet, engineers can see their work as if it were in the real world. This isn’t a gimmick—engineers, after all, design products for the real world, and so visualizing it in place rather than on a computer monitor is an obvious advantage.

Besides the depth and perspective that spatial computing provides to visualization, another important benefit is scale. An engineer designing a small bracket could see that product in life size using a screen, but anything bigger than a computer monitor would be an exercise in imagination (even the most multi of monitor setups can’t fit a car, airplane or space station). With spatial computing, all designers have the opportunity to visualize their designs at scale.

This spatial computing benefit is most readily apparent to designers of, unsurprisingly, spaces. Architects, for instance, can use VR to virtually walk through their building designs, achieving a sense of the space that’s simply impossible through traditional computing. This walkthrough need not be limited to a static showcase, either. Inside VR, designers have the opportunity to make changes on a whim. Don’t like that material finish, that lighting, that façade? A few clicks of your VR controller and you can see it in any number of options. Want to change the time of day, the season of the year, the weather? Go for it. You’re in a virtual world; you control every aspect of it.

Regardless of what you’re designing, spatial computing gives you the ability to visualize it more realistically than ever. But you don’t have to be alone in your virtual world. Another big advantage of spatial computing is that it provides a three-dimensional meeting space. Putting the two together, VR gives engineering teams a way to conduct virtual design reviews with participants from around the globe. In these virtual meeting rooms, participants—who are often represented by virtual avatars—can walk around, talk about, and review 3D models as if they were evaluating a real prototype. Not only does this save the costs of manufacturing and travel, it allows engineering teams to iterate faster and develop better end products.

Factory planning, maintenance and optimization

There are many ways that engineers can use spatial computing for manufacturing. In the same way that an architect can walk through a virtual building, a factory planner can use VR to see a virtual layout of their facility. This realistic and immersive visualization allows them to not just see but experience problems, such as machinery collisions or insufficient spacing, that might otherwise go undetected.

Spatial computing also provides the opportunity to simulate how factory workers interact with the environment, a crucial step for optimizing ergonomics. Even if a process seems fine on a computer monitor, stepping into a VR version of it would make it readily apparent that workers would have to, say, bend down too much to grab the next component. It’s a fix that’s all the simpler for catching it in advance.

AR and VR can both improve the process of equipment maintenance as well. This might take the form of an augmented video call between a worker and an off-site maintenance technician, who could annotate a piece of equipment from afar while the worker sees the notes, in place on the equipment, through an AR-enabled tablet. The technician might have learned about the equipment from a VR manual, virtually taking it apart and putting it back together.

Another popular use case for VR is for operator training, as it provides an unparalleled platform to simulate different scenarios. This could be used to train workers on their core job and beyond. For example, a VR simulation of a factory fire or hazardous spill could get every employee viscerally comfortable with emergency procedures.

Presentations and marketing

The immersive experience of spatial computing is a natural fit for showing off your product, whether internally or externally. For the same reasons that engineers and architects enjoy VR for design visualization and collaboration, the technology is a great option for presenting product concepts to others within an organization. Sketches and renders are nice, but they don’t beat life-like representation in a real environment.

Similarly, consumers increasingly appreciate—in some categories, even expect—spatial computing models that they can try on at home, so to speak. This is particularly common for products with lots of aesthetic variation, such as furniture. It may be difficult to pick out the perfect sectional in a brightly-lit showroom, but if you were able to compare options in your own home, the choice would be much easier. All you need is an AR-capable phone—and for the manufacturer to give you an AR option on their website. Combined with configuration tools, spatial computing gives potential customers the most convincing and personalized sales pitch you could imagine.

U.S. Coast Guard blames engineering failures for Titan sub tragedy
https://www.engineering.com/u-s-coast-guard-blames-engineering-failures-for-titan-sub-tragedy/

New report says the implosion that cost five lives on a June 2023 dive to the Titanic was a “preventable tragedy.”

Welcome to Engineering Paper. Today’s top story isn’t about design or simulation software, but the importance of proper design, rigorous analysis, and engineering responsibility.

The U.S. Coast Guard Marine Board of Investigation (MBI) last week released its Report of Investigation (ROI) on the OceanGate Titan, the submersible that imploded in a June 2023 dive to explore the wreckage of the Titanic. The five people on board were killed instantly.

After two years of investigating, the MBI concluded that the tragedy could have been prevented.

“The U.S. Coast Guard’s Marine Board of Investigation into the fatal incident found that OceanGate’s failure to follow established engineering protocols for safety, testing, and maintenance of their submersible, was the primary causal factor,” reads the ROI’s executive summary.

The conclusion was unsurprising. In the days and weeks after the implosion, engineers around the world quickly discovered a rash of problems with the submersible’s design. Here’s a selection of some of the videos and articles Engineering.com published at the time:

Simulation Reveals Exactly How Titan Submersible Imploded

Was the OceanGate Sub Implosion an Engineering Failure?

The Titan Tragedy—A Deep Dive Into Carbon Fiber, Used for the First Time in a Submersible

The MBI’s report confirms that the Titan suffered from blatant design problems. The ultimate failure point that led to the submersible’s implosion was “either the adhesive joint between the Titan’s forward dome and the titanium segment or the carbon fiber hull near the forward end of the Titan,” according to the report.

The report condemns OceanGate, the company that built and operated Titan, for cutting corners in design, lacking standard engineering procedures, failing to investigate clear problems, dismissing internal concerns, and intentionally skirting regulations, among other issues.

I’m sure many of you were struck by the Titan story when it unfolded in 2023. I was too. In fact, I had a distant connection to it. In 2020 I interviewed OceanGate’s then director of engineering for a series of articles about how the company was using Onshape for CAD. It should go without saying that we didn’t discuss the many engineering and organizational problems that he later reported to the MBI (you can read about them in section 4.6.10.5 of the ROI). It should also go without saying that OceanGate’s CAD system was not to blame for the tragedy, though you can understand why Onshape removed the OceanGate case study from their website (it’s archived here).

Engineers bear the burden of their professional decisions. The Titan joins a dark list of engineering failures that weigh that burden in lives: Paul-Henri Nargeolet, French deep sea explorer and Titanic expert; Pakistani businessman Shahzada Dawood and his 19-year-old son, Suleman Dawood; British businessman and explorer Hamish Harding; and Stockton Rush, OceanGate’s CEO, who was piloting the Titan on its final voyage.

*****

And now, some software news:

Quick hits

  • Forrester has named Siemens and Aras as the leaders of its Forrester Wave: Product Lifecycle Management Platforms For Discrete Manufacturers, Q3 2025 report. Siemens Teamcenter X ranked as the customer favorite PLM platform with the strongest strategy, according to Forrester, and Aras Innovator edged ahead as the strongest offering in the pool of eight PLM providers.
  • In other Siemens news, Siemens Digital Industries Software has launched the new PartQuest Design Enablement portfolio for electronic component manufacturers. “By unifying design content, supply intelligence, collaboration and real-time analytics, PartQuest Design Enablement gives manufacturers… a powerful way to drive both operational efficiency and customer satisfaction at scale,” said AJ Incorvaia, senior vice president at Siemens Digital Industries Software, in the press release. There’s more info on the new solution, including a video overview, in this Siemens blog.
  • Comsol announced the keynote speakers for its upcoming Comsol Conference 2025 Boston, taking place October 8 – 10. The speaker lineup will include Kyle Koppenhoefer of AltaSim Technologies, Zhen (Jim) Sun of Amazon Lab126, Soon Kiat Lau of Conagra Brands, Juejun (JJ) Hu of MIT, Hannah Alpert of NASA Ames Research Center, and Hanna Paddubrouskaya of Tokyo Electron US (TEL).
  • Jetcam has updated its free CAD Viewer tool. Version 4 of the Windows-based 2D viewer has new features including support for the Windows dark or light theme, 32-bit and 64-bit versions, new layer visibility controls, and more.
  • Bentley Systems has announced the finalists for its 2025 Going Digital Awards, an annual program meant to honor infrastructure around the globe. From a pool of nearly 250 nominations, according to Bentley, an independent panel selected 37 finalists across 12 categories. The finalists will present their projects at Bentley’s annual Year in Infrastructure conference, taking place October 15 – 16 in Amsterdam, and the winners will be announced there. You can browse the full list of finalists here.

One last link

As American as Apple phones? Read about the iPhone maker’s domestic investment in Apple announces $100B American Manufacturing Program by Engineering.com senior editor Michael Ouellette.

Got news, tips, comments, or complaints? Send them my way: malba@wtwhmedia.com.

Nvidia Omniverse coming to PTC Creo and Windchill
https://www.engineering.com/nvidia-omniverse-coming-to-ptc-creo-and-windchill/

Plus PTC pledged itself to the Alliance for OpenUSD, and more design and simulation software news.

This is Engineering Paper, and here’s the latest design and simulation software news.

PTC has expanded its partnership with Nvidia. The Boston-based developer, which not long ago was rumored to be up for sale, says it will integrate Nvidia Omniverse technologies into Creo and Windchill.

“By connecting Windchill with Omniverse’s real-time, photorealistic simulation development platform, teams will be able to visualize and interact with the most current Creo design data in a shared, immersive environment,” reads PTC’s press release.

PTC has also joined the Alliance for OpenUSD (AOUSD), a group working to advance the Pixar-created OpenUSD file framework used in Nvidia Omniverse. Nvidia was one of the five founding members of the AOUSD alongside Pixar, Adobe, Apple, and Autodesk. In June, engineering software developer Tech Soft 3D also announced a collaboration with Nvidia and joined the AOUSD.

“By deepening our collaboration with Nvidia and joining the Alliance for OpenUSD, we’re giving our customers the ability to incorporate design and configuration data in a real-time, immersive simulation environment,” said Neil Barua, president and CEO of PTC, in the press release. “The integration of Omniverse technologies within Creo and Windchill will enable teams to accelerate development, improve product quality, and collaborate more effectively across the entire product lifecycle.”

Desktop Metal files for Chapter 11

The story of 3D printing company Desktop Metal has reached Chapter 11.

“Barely more than two years after Stratasys made a $1.8B bid for it and just a few weeks after Nano Dimension acquired it for a fraction of that price, Desktop Metal has filed for bankruptcy protection under Chapter 11 of the U.S. Bankruptcy Code,” wrote Engineering.com 3D printing editor Ian Wright in his coverage of the news.

“After much speculation about the fate of the beleaguered metal AM company… this looks like the end of what was once the darling of investors and 3D printing enthusiasts alike,” Ian wrote.

For more details, read the full article on Engineering.com: Desktop Metal files for Chapter 11.

ITC goes dark with IntelliCAD 14.0

The IntelliCAD Technology Consortium announced the release of IntelliCAD 14.0, the latest version of the member-funded CAD development platform.

IntelliCAD 14.0 introduces a dark mode, which in my opinion is an accessibility setting that belongs in every software package (I’m baffled by extremely popular applications that still lack the option—I’m looking at you, Google Docs).

“While dark is now the default, you can also choose from light or gray themes,” according to ITC’s video overview of IntelliCAD 14.0.

Screenshot of IntelliCAD 14.0. (Image: IntelliCAD Technology Consortium.)

The new release also adds faster performance for common functions including copy, break, move, and union, as well as detachable drawing windows, support for Autodesk Revit 2025 files, API enhancements, and more.

“IntelliCAD 14.0 reflects our commitment to listening to real-world feedback from our members and delivering the tools they need most,” said Shawn Lindsay, president of the IntelliCAD Technology Consortium, in the release announcement. “We remain focused on providing an open, dependable platform that developers can build on—and on offering a powerful alternative in the CAD software market.”

One last link

Engineering.com executive editor Jim Anderton’s latest episode of End of the Line discusses the rapidly changing technology of warfare: The war in Ukraine: The end of armor as we know it.

Got news, tips, comments, or complaints? Send them my way: malba@wtwhmedia.com.

How to choose the right spatial computing hardware for engineering
https://www.engineering.com/how-to-choose-the-right-spatial-computing-hardware-for-engineering/

From HMDs to omnidirectional treadmills to 3D displays and beyond, there are a lot of options to consider. This guide will help.

Spatial computing often requires specialized hardware. Though commonplace phones and tablets can power some augmented reality (AR) experiences, dedicated and expensive headsets—and the workstations that power them—are often required for enterprise applications. Other spatial computing hardware, such as 3D monitors and omnidirectional treadmills, shows that in spatial computing, one size does not fit all.

In this article, we’ll explore these hardware options with a focus on engineering applications.

Head-mounted displays and beyond

The stereotypical image of a head-mounted display (HMD) is a bulky, protruding visor that blocks out the external world, fastened to one’s head with straps and clasps and a cable tethering it all to a power source. While the industry continues to strive for smaller, lighter and more comfortable headsets, the most functional (and expensive) HMDs are all a bit unwieldy, stuffed as they are with electronics and sensors.

For now, these bulky HMDs are a necessity for virtual reality (VR) and mixed reality (MR) applications. But the benefits can outweigh the discomfort, since these HMDs are the most effective at producing convincing VR experiences. The best HMDs on the market can provide engineers with photorealistic and stable virtual images that are truly effective representations of real objects. With the proper hardware and know-how, an automotive designer could produce an MR experience of a virtual car that reflects the real world around it—the glints from the actual windows, the distorted reflection of the design studio—an effect nearly as convincing as if the car were literally there.

While VR and MR require the user to don an opaque HMD, dedicated AR headsets are transparent, projecting virtual elements on top of the real world. This is less claustrophobic than a VR or MR headset, but also less convincing. The images on AR displays are often constrained to a narrow field of view and can appear ghostly against the real environment.

But HMDs aren’t required for spatial computing, and depending on their needs, a user could make do with hardware they already own. AR can be achieved with standard smartphones and tablets, which use a combination of cameras and lidar sensors to provide a window into an augmented world. This flexibility makes AR more accessible than VR, especially in the field or on the factory floor.

Not all hardware makes for an equal AR experience. Using a phone or tablet for AR may be convenient, but the quality, spatial stability and responsiveness of the AR models won’t be as good as with a high-end headset. For engineers hoping to use AR to evaluate their designs, an HMD will provide the best experience. AR through a phone or tablet simply can’t provide the same level of immersion as an HMD.

Ultimately, the best hardware for your needs will be dependent on your application, your software and your budget. As more devices emerge to support spatial computing, more engineering applications of this technology will be close behind.

Spatial computing workstations

Now that we’ve discussed HMDs and their alternatives, it’s important to note a second crucial hardware element: workstations.

There are two types of HMDs: those that are self-contained and those that require external control. Self-contained HMDs are computers in a specialized form. They have their own processors and power supplies built in, meaning all a user has to do is put on the headset and get to work. Self-contained HMDs are convenient, but like any mobile device, their performance is limited by their form factor.

The second type of HMD must be connected to an external computer, which provides power and computation. Though this wired connection can restrict movement and increase discomfort, it also allows users to take advantage of the best available hardware. Engineers often have powerful workstations equipped with one or more graphics cards. In the same way that this hardware accelerates photorealistic product renders, it processes and sends high-quality VR images to the HMD. Don’t forget that every HMD must display two images at once—one for the left eye and a different one for the right—so rendering a 4K scene for each eye means pushing roughly double the pixels of a single 4K frame.
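A quick back-of-the-envelope calculation makes the demand concrete (the 90 Hz refresh rate is an assumed, typical value):

```python
# Rough pixel-throughput estimate for stereo rendering: 4K per eye at an assumed 90 Hz.
per_eye = 3840 * 2160          # "4K" pixel count for one eye
refresh_hz = 90                # typical VR refresh rate (assumed)

stereo_pixels = 2 * per_eye                    # both eyes, every frame
pixels_per_second = stereo_pixels * refresh_hz

print(f"pixels per frame (both eyes): {stereo_pixels:,}")            # ~16.6 million
print(f"pixels per second at {refresh_hz} Hz: {pixels_per_second/1e9:.2f} billion")
# ~1.5 billion shaded pixels per second, before supersampling or lens-distortion overdraw.
```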

Workstations are the best bet for such high computational demands. Fortunately, most engineers already have access to one. For those that don’t, high-performance spatial computing will require an additional investment in a workstation with plenty of memory, a relatively good CPU and one or more discrete graphics cards. HMD manufacturers generally provide a list of minimum specs or recommended computers that can point you to the right hardware.

Related: The ultimate guide to buying an engineering computer

There is a way to combine high-performance spatial computing with the convenience of a standalone headset: wireless streaming. As with other domains, this isn’t a perfect solution. For one thing, the HMD still needs a power source, so it will either have a limited battery life or be tethered to power anyways. For another, streaming comes with latency. If there’s a slight delay between when a user turns their head and when a VR scene responds, the illusion is compromised. Some users may even experience motion sickness. Streaming is only an option for users who can ensure a consistent and low-latency connection.
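To see why latency is the hard part, here is a hypothetical motion-to-photon budget; every stage value below is an illustrative assumption rather than a measured figure, and the 20 ms target is a commonly cited comfort threshold.

```python
# Hypothetical motion-to-photon latency budget for wireless VR streaming.
# All per-stage values are illustrative assumptions, not measurements.
target_ms = 20.0   # commonly cited comfort target for motion-to-photon latency

stages_ms = {
    "head tracking + pose send": 2.0,
    "workstation render":        8.0,
    "video encode":              4.0,
    "wireless transmission":     3.0,
    "headset decode + display":  5.0,
}

total = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"  {stage:<28} {ms:4.1f} ms")
print(f"total: {total:.1f} ms (target {target_ms:.1f} ms)")
# This assumed pipeline already overshoots the target by 2 ms -- any network jitter
# on top of that is what users perceive as lag, or worse, motion sickness.
```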

Other spatial computing devices

Phones, tablets and HMDs are the most popular and important types of spatial computing hardware. But there are other devices that enable or facilitate spatial computing, and this section will briefly discuss them.

Handheld controllers often accompany HMDs, aiming to give users more precise control over movement and selection in the virtual world. Some HMDs use a single controller, while others come with two. With two controllers, each one can be specialized—for example, one could have a joystick for moving, while another has a trackpad for interacting with on-screen elements. Some HMDs eschew controllers altogether, using eye tracking and/or hand gestures to manipulate the user interface.

To maintain the illusion of virtual space, HMDs must be able to track the user’s position in real space and map it appropriately. Some HMDs perform this tracking on board, but others require external sensors to be set up around the user. In either case, users must limit their own movement to avoid literally crashing into reality. One way of avoiding this problem is an omnidirectional treadmill, a platform that allows users to walk as far as they want in any direction while remaining in place. Omnidirectional treadmills are specialized pieces of equipment that, for now, are more arcade novelty than industrial necessity. Still, they demonstrate that by systematically eliminating the constraints of the real world, spatial computing can more fully realize the potential of the virtual.

3D displays are an emerging (more accurately, re-emerging) type of monitor that simulates depth and perspective. The working principle is the same as any spatial computing display: show the left eye one image and the right eye another. The biggest advantage of modern 3D screens is that they accomplish this without the need for glasses. Using eye tracking cameras, lenticular lenses and appropriate rendering algorithms, these so-called autostereoscopic displays provide a convincing 3D effect to the bare eyes. This allows engineers to quickly and easily gain a 3D perspective on CAD models, for example, which on traditional monitors are mere 2D representations. The benefits of this shift in perspective have been questioned, however, especially in light of the high cost of 3D displays. Though 3D displays are technically a form of spatial computing, they’re a limited form that is yet to gain widespread traction among engineers.

AI model promises “almost-instantaneous aerodynamic predictions”
https://www.engineering.com/ai-model-promises-almost-instantaneous-aerodynamic-predictions/

SHIFT-Wing, Luminary Cloud’s second physics AI model, gets off the ground. That and more software news from Ansys, CoLab, and beyond.

You’re reading Engineering Paper, and here’s the latest design and simulation software news.

Simulation provider Luminary Cloud has released SHIFT-Wing, a pre-trained AI model for transonic wing design. This follows SHIFT-SUV, a model for SUV aerodynamics that Luminary released in April.

Luminary says that SHIFT-Wing’s real-time AI inference will allow aerospace engineers to explore more wing designs earlier in the conceptual design phase.

According to Luminary Cloud’s blog post, SHIFT-Wing was trained on a dataset of thousands of parametrically modified variants of the NASA Common Research Model wing-body geometry, which were built and modified in Onshape and simulated with Luminary’s CFD platform.
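The general pattern behind this kind of physics AI model, fitting a fast surrogate to a batch of parametric CFD runs so later predictions are near-instant, can be sketched in a few lines. The toy example below uses synthetic data and an off-the-shelf regressor; it is in no way Luminary's SHIFT pipeline or model architecture.

```python
# Toy surrogate-model sketch: fit a fast regressor to parametric "CFD" results.
# Illustrates the general pattern only -- not Luminary Cloud's SHIFT pipeline.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_runs = 2000
# Assumed geometry parameters: sweep angle (deg), thickness-to-chord ratio, twist (deg).
X = np.column_stack([
    rng.uniform(25, 40, n_runs),
    rng.uniform(0.08, 0.14, n_runs),
    rng.uniform(-3, 3, n_runs),
])
# Synthetic stand-in for a drag-coefficient-like CFD output, with noise.
y = (0.02 + 1e-4 * (X[:, 0] - 30) ** 2 + 0.5 * X[:, 1] ** 2
     + 1e-3 * np.abs(X[:, 2]) + rng.normal(0, 5e-4, n_runs))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print(f"R^2 on held-out runs: {model.score(X_test, y_test):.3f}")
# Once trained, model.predict() answers in microseconds instead of hours of CFD --
# the "almost-instantaneous inference" idea, in miniature.
```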

Like SHIFT-SUV, SHIFT-Wing was developed with Nvidia’s PhysicsNeMo AI model training framework. Luminary consulted with Honda to develop the SUV model, and SHIFT-Wing also had a helping hand from inside the industry: Otto Aviation.

Comparison of a CFD analysis to SHIFT-Wing inference. (Image: Luminary Cloud.)

“SHIFT-Wing unlocks AI-driven innovation for the next generation of aircraft by allowing aerospace companies to feasibly explore more designs than previously possible and to use the almost-instantaneous aerodynamic predictions to introduce interactions with other elements of the design, including structural analysis and actuator and control system design,” said Juan Alonso, CTO and co-founder of Luminary Cloud, in the company’s press release.

Luminary Cloud is betting big on physics AI models. Alonso explained why in a recent webinar, Physics AI: The Engineering Revolution You Need to Be Prepared For. It was part of Engineering.com’s Design and Simulation Week earlier this month, which is now available on demand. If you have any interest in AI simulation, it’s worth a watch.

A closeup on CoLab AutoReview

Last week I wrote that Canadian developer CoLab released AutoReview, an AI tool that reviews drawings and 3D models to catch design problems.

I’ve since spoken with CoLab co-founder and CEO Adam Keating, who told me more about AutoReview and his company’s vision for AI. It’s a vision that clearly resonates with engineers.

“The biggest problem right now is way more people wanted it than we were planning,” Keating said of AutoReview. The waitlist for early access already has 27,000 names.

CoLab AutoReview highlights the inconsistent wall thickness on this camera housing model. (Image: CoLab.)

For more details on AutoReview, check out my latest article for Engineering.com: New AI tool catches design errors (and helps engineers learn from them).

Ansys 2025 R2 adds AI copilot + more

Ansys, fresh from its $35B acquisition by Synopsys, has announced the 2025 R2 version of its simulation platform.

The headline update? I’ll give you three guesses, and if you still can’t get it, try asking Ansys’s new AI product support chatbot, Ansys Engineering Copilot.

It’s always a product support chatbot.

“Ansys Engineering Copilot [is] a new multifunctional virtual AI assistant integrated into Ansys… [that] equips users with one-click access to over 50 years of simulation expertise, learning resources, and AI-powered support from within the Ansys user interface,” reads the Ansys press release.

That’s not it for AI, though. Ansys Engineering Copilot is merely one of three pillars of AI advancements in Ansys 2025 R2, according to an Ansys video on the release (see below). The other pillars are Ansys SimAI, the company’s tool for developing custom AI models, and something called Ansys AI+, which according to the video “supercharges existing Ansys solvers with faster runtimes, higher accuracy, and smarter optimization.” I can’t believe we’re already at the AI+ stage of AI—time sure does fly.

For more details on the 2025 R2 release, see Ansys’s release highlights page.

Quick hits

  • Altair, fresh from its $10B acquisition by Siemens, has released an eBook detailing 100 AI-Powered Engineering Use Cases. Number 97 will shock you!
  • Stratasys released GrabCAD Print Pro 2025, an update to its 3D printing preparation software. The new release includes an integrated fixture design application called Fixturemate that Stratasys says can reduce fixture design time by up to 80%.
  • The Open Group has formed the Open Digital Transformation Forum, a network of “business leaders, technology professionals, industry experts, and academics who are committed to navigating the complexities of Digital Transformation.” I was once involved in an unrelated Digital Transformation Forum, and one of the problems with that initiative was the suggestive acronym (this new one wisely uses ODXF).

One last link

Learn why gallium nitride is the next big thing in semiconductors in Engineering.com’s latest podcast with host Jim Anderton and EEworldonline.com editor-in-chief Aimee Kalnoskas.

Got news, tips, comments, or complaints? Send them my way: malba@wtwhmedia.com.

The post AI model promises “almost-instantaneous aerodynamic predictions” appeared first on Engineering.com.

How is hardware-in-the-loop (HIL) testing used in automotive engineering? https://www.engineering.com/how-is-hil-testing-used-in-automotive-engineering/ Sun, 27 Jul 2025 17:24:06 +0000 https://www.engineering.com/?p=141653 HIL testing enables faster, safer and more cost-effective development through real-time simulation of complex vehicle systems.

The post How is hardware-in-the-loop (HIL) testing used in automotive engineering? appeared first on Engineering.com.

Numerous industries use hardware-in-the-loop (HIL) testing to simulate real-world conditions by replacing a physical system with a real-time virtual representation of it, while the controller under test remains actual hardware. By connecting that controller to a simulation of the product it will govern, manufacturing and engineering teams can test products in a controlled environment before deploying them. The automotive industry has been a leading adopter of HIL testing.
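To make the loop concrete, here is a minimal sketch in Python (illustrative only; a production HIL rig runs on a dedicated real-time target with physical I/O, and the plant and controller below are invented placeholders). A simulated plant is advanced at a fixed time step while exchanging signals with the controller, which in a true HIL setup would be the physical ECU rather than a software stub.

```python
DT = 0.01  # 10 ms step; a real HIL rig enforces this timing in hard real time

def plant_step(temperature, heater_on, dt=DT):
    """Toy thermal plant: temperature drifts toward ambient, the heater adds heat."""
    ambient = 20.0
    heat_in = 5.0 if heater_on else 0.0
    return temperature + dt * (heat_in - 0.1 * (temperature - ambient))

def controller(temperature, setpoint=60.0):
    """Software stand-in for the ECU; in HIL the real controller hardware replaces this."""
    return temperature < setpoint

temperature = 20.0
for step in range(3000):
    # Simulated sensor value goes out to the controller; its command comes back in.
    heater_on = controller(temperature)
    # The plant model advances one fixed time step using that command.
    temperature = plant_step(temperature, heater_on)
    if step % 500 == 0:
        print(f"t={step * DT:5.2f} s  temperature={temperature:6.2f} C  heater={'on' if heater_on else 'off'}")
```

The essential point is that the same exchange of sensor values and actuator commands happens whether the controller side is a model, generated code or the real ECU; only the realism of each side changes.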

Why is HIL testing important in automotive engineering?

In recent years, the number of electronic control units (ECUs) in automobiles has grown dramatically. These ECUs have replaced many mechanical components and handle various functions and input/output, making them prime candidates for HIL testing. By simulating the operation of ECU-guided automotive products, teams can test virtual representations of products instead of physically testing finished prototypes.

HIL testing offers numerous benefits to automotive engineers. By testing virtual models of products, teams can save significant time and expense. HIL testing can also identify potential flaws earlier in the workflow, when the flaws can be fixed more affordably and efficiently. And, with virtual models replacing physical models, HIL testing can consider numerous scenarios without the time and expense required for physical tests. It also offers safety benefits, enabling teams to simulate conditions without exposing humans to dangerous situations.

Examples of automotive HIL testing

A wide variety of automotive products and systems can benefit from HIL testing. Some systems involve direct interaction with the driver, while others are controlled automatically by sensors and computer-aided devices. If software is updated over the air (OTA), HIL testing can also incorporate these updates and modify test scenarios accordingly. Here are some examples:

Engines: One or more engine ECUs govern engine operation, converting sensor measurements into actions such as adjusting air intake. HIL testing simulates engine operation and interaction with real I/O devices such as acceleration pedals. For internal combustion engines, this might include testing controllers that handle fuel consumption and emission control. For electric vehicles (EVs), HIL testing might simulate a motor-generator, battery and connections to direct-drive transmission.
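As a rough illustration of the plant side of such a test, the sketch below (Python, with an invented pedal map, time constant and speed range, not figures from any real ECU) converts an accelerator-pedal position into a throttle command and lets engine speed follow it with first-order dynamics.

```python
DT = 0.001  # 1 ms step

def pedal_to_throttle(pedal_pct):
    """Stand-in for the ECU's pedal map: clamp 0-100% pedal travel to a 0-1 throttle."""
    return min(max(pedal_pct / 100.0, 0.0), 1.0)

def engine_plant(rpm, throttle, dt=DT):
    """First-order engine-speed response to throttle (illustrative numbers only)."""
    idle_rpm, max_rpm = 800.0, 6000.0
    target_rpm = idle_rpm + throttle * (max_rpm - idle_rpm)
    tau = 0.5  # seconds: how quickly engine speed follows the throttle
    return rpm + dt * (target_rpm - rpm) / tau

rpm = 800.0
for ms in range(3000):                    # three seconds of simulated time
    pedal = 40.0 if ms > 500 else 0.0     # pedal pressed to 40% after 0.5 s
    rpm = engine_plant(rpm, pedal_to_throttle(pedal))
print(f"engine speed after 3 s at 40% pedal: {rpm:.0f} rpm")
```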

Powertrains: Transmission systems, power electronics and battery management systems can be tested with real-time simulations. HIL testing simulates the operation of various components such as converters, relays, onboard power system components and charging systems. With the growth of EVs, battery management is a key design consideration.
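For battery management in particular, a common first-cut plant is a coulomb-counting state-of-charge model; the sketch below (Python, with invented capacity and current values) shows the sort of simulation a battery management controller could be exercised against.

```python
DT = 0.1  # 100 ms step

def soc_step(soc, current_a, capacity_ah=60.0, dt=DT):
    """Coulomb counting: integrate current (positive = discharge) and clamp to 0-100%."""
    return min(max(soc - (current_a * dt / 3600.0) / capacity_ah, 0.0), 1.0)

soc = 0.80                              # start at 80% state of charge
for _ in range(int(600 / DT)):          # ten minutes of simulated driving
    soc = soc_step(soc, current_a=90.0) # constant 90 A discharge (illustrative)
print(f"state of charge after 10 min at 90 A: {soc:.1%}")
```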

Chassis and vehicle dynamics: This can include steering, braking, suspension and traction control systems. For example, a steering system can be tested using HIL techniques to model vehicle handling behavior based on digital input that simulates steering wheel actions. The testing can aid development of steering controls and electro-mechanical actuators. Similarly, brake hydraulics can be simulated with HIL testing to model braking maneuvers and vehicle responses.
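On the simulation side, a steering test of this kind needs at least a simple vehicle-dynamics model. One minimal option is a kinematic bicycle model, sketched below in Python with illustrative parameters, which turns a simulated steering-wheel input into position and heading that could be fed back to the controller under test.

```python
import math

DT = 0.01  # 10 ms step

def bicycle_step(x, y, heading, speed, steer_angle, wheelbase=2.7, dt=DT):
    """Kinematic bicycle model: advance position and heading by one time step."""
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    heading += speed / wheelbase * math.tan(steer_angle) * dt
    return x, y, heading

x = y = heading = 0.0
speed = 15.0                  # m/s, roughly 54 km/h
steering_ratio = 16.0         # steering-wheel angle to road-wheel angle (illustrative)
wheel_input_deg = 45.0        # simulated steering-wheel input
steer = math.radians(wheel_input_deg) / steering_ratio
for _ in range(500):          # a 5 s maneuver
    x, y, heading = bicycle_step(x, y, heading, speed, steer)
print(f"after 5 s: x={x:.1f} m, y={y:.1f} m, heading={math.degrees(heading):.1f} deg")
```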

Advanced driver assistance systems (ADAS): Cameras, sensors and other devices can help alert drivers of potential collisions with other vehicles, detect pedestrian and roadside obstacles, and in the case of autonomous vehicles (AVs), take over driving under certain conditions. HIL testing can simulate real-life situations in developing and validating these systems.

Interior features: Seating, lighting, heating, air conditioning and other systems now rely heavily on ECUs. HIL testing can be used to test scenarios such as power-seat operation and climate-control systems, which may include heat exchangers, compressors, sensors, actuators, ductwork and other devices. HIL testing can simulate the functionality of various components and ECUs and their connection to the centralized ECU.

Entertainment and information systems: These systems connect the driver and passengers to a wide variety of information sources, including mobile devices, conventional and satellite radio, cloud resources, dashboard display systems and diagnostics. They also play a key role in coordinating OTA updates. HIL simulation can test various instrument clusters, displays and warning systems, with a real-time target machine running a virtual vehicle and direct interfaces using analog and digital signals or standard communication protocols.

HIL testing is finding many applications related to electric vehicles (EVs), such as battery management. (Image source: Adobe Stock.)

What other types of testing are used with embedded systems?

In addition to HIL testing, various other types of testing are used in embedded automotive systems:

  • Model-in-the-loop (MIL) testing simulates both the controller and the physical systems with virtual models instead of physical hardware. MIL testing is often used in early development stages to verify design assumptions and control algorithms (see the sketch after this list).
  • Software-in-the-loop (SIL) testing runs the software in a simulated environment to verify that the control algorithms are operating correctly, providing a bridge between model simulations and real-world applications.
  • Processor-in-the-loop (PIL) testing executes the control algorithms on the actual processor or other device connected to the simulated environment. This technique checks for potential issues related to code generation, execution timing and processor-specific behavior, confirming that the software will perform as intended.
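As a rough sketch of how the same control logic moves through these stages, the example below (Python, illustrative only) runs a cruise-control-style algorithm against a simulated vehicle, which is the MIL arrangement. In SIL, the production code generated from the model would replace the Python function; in HIL, the physical ECU would replace it while the plant model stays the same.

```python
DT = 0.01  # 10 ms step

def cruise_controller(speed, target=25.0, kp=0.5):
    """Control algorithm under test: proportional throttle command, clamped to 0-1.
    In MIL this runs as a model; in SIL as generated code; in HIL on the real ECU."""
    return min(max(kp * (target - speed), 0.0), 1.0)

def vehicle_plant(speed, throttle, dt=DT):
    """Simulated plant shared by the MIL, SIL and HIL stages (illustrative dynamics)."""
    drive_force = throttle * 4000.0        # N
    drag = 0.4 * speed * speed + 150.0     # aerodynamic plus rolling resistance, N
    mass = 1500.0                          # kg
    return speed + dt * (drive_force - drag) / mass

speed = 0.0
for _ in range(int(30 / DT)):              # 30 s of simulated driving
    speed = vehicle_plant(speed, cruise_controller(speed))
print(f"speed after 30 s under cruise control: {speed:.1f} m/s")
```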

Several variations of HIL have also been developed. Power hardware-in-the-loop (P-HIL) testing introduces power amplifiers or other power equipment to convert low-voltage signals from a real-time system into the higher voltages of the emulated device. This approach enables teams to test power components of the control system in a framework similar to actual operating conditions.

Virtual HIL (vHIL) testing enables creation and execution of tests before the actual ECU hardware is available. With the vHIL approach, testing can begin earlier and be automated to guide subsequent testing.

What standards are applicable to HIL testing?

A variety of standards and guidelines apply to automotive HIL testing. The International Organization for Standardization (ISO) established ISO 26262 (road vehicles functional safety) as an international standard for functional safety of electrical and/or electronic systems. ISO 26262 includes requirements for HIL development processes and documentation of these processes, as well as qualification and validation. Other ISO standards applicable to automotive testing include ISO 21448 (road vehicles — safety of the intended functionality or SOTIF) and ISO 21434 (road vehicles cybersecurity engineering), which was developed in conjunction with SAE International.

The Association for Standardization of Automation and Measuring Systems (ASAM) published ASAM XIL, an API standard that covers multiple types of in-the-loop testing, including HIL, MIL and SIL. The standard provides guidance on communication between test automation tools and test benches, facilitating the integration of HIL technology from different vendors.

Various communication protocols apply to HIL testing. Protocols such as Ethernet, controller area network (CAN) and local interconnect network (LIN) define connections between the real-time test system and the actual embedded controller. Actuator interfaces that connect the test equipment to the simulated system use these communication protocols to accurately capture hardware responses and feed them back into the simulation for real-time analysis.
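On the test-automation side these links are typically scripted. The sketch below uses the third-party python-can package and a Linux SocketCAN virtual interface (assumed here for illustration; the article does not name specific tooling) to show the flavor of sending a simulated sensor frame to an ECU and waiting for its reply.

```python
# Minimal sketch assuming the third-party python-can package and a Linux
# SocketCAN virtual interface ("vcan0"); the IDs and payloads are invented.
import can

bus = can.interface.Bus(channel="vcan0", interface="socketcan")

# Send a frame representing a simulated wheel-speed sensor reading.
msg = can.Message(arbitration_id=0x123, data=[0x00, 0x64], is_extended_id=False)
bus.send(msg)

# Wait up to one second for the ECU's response frame and print it.
reply = bus.recv(timeout=1.0)
if reply is not None:
    print(f"ECU replied: id=0x{reply.arbitration_id:X} data={reply.data.hex()}")
else:
    print("no response from ECU within 1 s")

bus.shutdown()
```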

Future applications of automotive HIL testing will likely incorporate artificial intelligence (AI), automation and digital twins. AI and automation can help improve efficiency, simulating driver behavior, traffic conditions, and other complex situations. Digital twins — virtual representations of physical systems — can enhance the realism of HIL simulations, allowing for real-time synchronization between virtual and physical components.

HIL testing is also likely to become more modular and scalable, allowing test setups to be adapted to a wide range of vehicle types, from compact cars to luxury sedans and commercial vehicles.

The post How is hardware-in-the-loop (HIL) testing used in automotive engineering? appeared first on Engineering.com.
