Over the last month, it seems that in my ecosystem people have been incredibly focused on “THE BOM”, combined with AI agents working around the clock. One of the reasons I have this impression, of course, is my irregular participation in the Future of PLM panel discussions, moderated and organized by Michael Finocchiaro.
Yesterday, the continuously growing Future of PLM team held another interesting discussion: “A BOMversation”. You can watch the replay, including the comments made during the debate, here: To BOM or Not to BOM: A BOMversation
On the other hand, there is Prof. Jorg Fischer with his provocative post: 📌 2026 – The year we have to unlearn BOMs!
Sounds like a dramatic opening, but when you read his post and my post below, you will learn that there is a lot of (conceptual) alignment.
Then there are PLM vendors who announce “next-generation BOM management,” startup companies that promise AI-powered configuration engines, and consultants who explain how the BOM has become the foundation of digital transformation. (I do not think so)
And as Anup Karumanchi states, BOMs can be the reason why production keeps breaking.
I must confess that I also have a strong opinion about the various BOMs and their application in multiple industries.
My 2019 blog post The importance of EBOM and MBOM is among my top three most-read posts. BOM discussions (single BOM, multiview BOM, etc.) always attract an audience.
I continuously observe a big challenge at the companies I am working with – the difference between theory and reality.
If the BOM is so important, why do so many organizations still struggle to make it work across engineering, manufacturing, supply chain, and service?
The answer is twofold: LEGACY DATA, PROCESSES and PEOPLE, combined with the understanding that the BOM we are using today was designed for a different industrial reality.
Let me share my experiences, which take longer to digest than an entertaining webinar.
Some BOM history and theory
Historically, the BOM was a production artifact. It described what was needed to build something and in what quantities. When PLM systems emerged, the 3D CAD model structure became the authoritative structure representing the product definition, driven mainly by PLM vendors with dominant 3D CAD tools in their portfolios.
As the various disciplines in the company were not integrated at all, the BOM structure derived from the 3D CAD model was often a simplified way to prepare a BOM for ERP. The transfer to ERP was done manually (retyping the structure in ERP), semi-automated (using Excel export and import with some manipulation) or automated through an “intelligent” interface.
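To make the Excel-based path concrete, here is a minimal sketch of flattening a CAD-derived assembly structure into a parts list that could be imported into ERP. All class names, fields and part numbers are hypothetical illustrations, not a specific CAD or ERP interface.

```python
# Hedged sketch: flattening a CAD-derived assembly structure into a flat
# parts list for ERP import. Names and fields are hypothetical; real
# CAD/PLM exports differ per vendor.
import csv
from dataclasses import dataclass, field
from typing import List

@dataclass
class CadNode:
    part_number: str
    description: str
    quantity: int = 1
    children: List["CadNode"] = field(default_factory=list)

def flatten(node: CadNode, level: int = 0, rows=None):
    """Walk the CAD structure top-down and collect one row per usage."""
    if rows is None:
        rows = []
    rows.append({"level": level, "part_number": node.part_number,
                 "description": node.description, "quantity": node.quantity})
    for child in node.children:
        flatten(child, level + 1, rows)
    return rows

def export_for_erp(root: CadNode, path: str) -> None:
    """Write the flattened structure to a CSV that ERP (or Excel) can import."""
    rows = flatten(root)
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["level", "part_number",
                                               "description", "quantity"])
        writer.writeheader()
        writer.writerows(rows)

# Example usage with a two-level structure
pump = CadNode("PUMP-001", "Pump assembly", children=[
    CadNode("HOUSING-010", "Cast housing"),
    CadNode("IMPELLER-020", "Impeller", quantity=2),
])
export_for_erp(pump, "pump_bom.csv")
```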

There are still a lot of companies working this way, probably because, due to the siloed organization, no one owns or drives a smooth flow of information across the company.
The need for an eBOM and mBOM
When companies become more mature and start to implement a PLM system, they will discover, depending on their core business processes, that it makes sense to split the BOM concept into a specification structure, the eBOM, and a manufacturing structure for ERP, the mBOM.
The advantage of this split is that the engineering specification can remain stable over time, as it provides a functional view of the product with its functional assemblies and part definitions.
This definition needs to be resolved and adapted for a specific plant with its local suppliers and resources. PLM systems often support the transformation from the eBOM to a proposed mBOM and, when done more completely, combine it with a Bill of Process.
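A minimal sketch of such an eBOM-to-mBOM transformation for one plant could look as follows. The substitution table, the manufacturing-only additions and the part numbers are hypothetical; real PLM systems drive this with configurable rules and resolved effectivities.

```python
# Hedged sketch of an eBOM -> mBOM transformation for one plant.
# Part numbers, the substitution table and the added process materials
# are hypothetical illustrations only.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class BomLine:
    part_number: str
    quantity: int

def derive_mbom(ebom: List[BomLine],
                plant_substitutions: Dict[str, str],
                plant_additions: List[BomLine]) -> List[BomLine]:
    """Resolve engineering parts to plant-specific parts and add
    manufacturing-only items (e.g. grease, packaging) for this plant."""
    mbom = [BomLine(plant_substitutions.get(line.part_number, line.part_number),
                    line.quantity)
            for line in ebom]
    return mbom + plant_additions

ebom = [BomLine("MOTOR-100", 1), BomLine("BOLT-M8", 6)]
# The plant buys an equivalent motor from a local supplier
substitutions = {"MOTOR-100": "MOTOR-100-LOCAL"}
additions = [BomLine("GREASE-5G", 1)]   # manufacturing-only item

mbom = derive_mbom(ebom, substitutions, additions)
for line in mbom:
    print(line.part_number, line.quantity)
```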

The advantages of a split into an eBOM and an mBOM are:
- Fewer engineering changes when supplier parts change
- Centralized control of all product IP related to its specification (eBOM/3D CAD)
- Efficient support for modularity, as each module has its own lifecycle and can be used in multiple products
Implementing an eBOM/mBOM concept
The theory, the methodology and implementation are clear, and you can ask ChatGPT and others to support you in this step.
However, where ChatGPT or service providers often fail is in motivating a company to take this next step, as either the legacy data and tools are incompatible (WHY CHANGE?), the future is not understood and feels risky (I DON’T LIKE IT), or the change is blocked for political or career reasons (I DON’T LIKE YOU, or the HiPPO says differently).
Extending to the sBOM
Companies that sell products in large volumes, like cars or consumer products, have discovered and organized a well-established service business, as the margins there are high.
Companies that sell almost unique solutions for customers, batch size 1 or small series, are also discovering the need for service plans and related pricing, or are being asked for them by their customers.
The challenge for these companies is that there is a lot of guesswork to be done, as the service business was not planned for in their legacy business. A quick and dirty solution was to use the mBOM in ERP as the source of information. However, the ERP system usually does not provide any context information, such as where the part is located and what other parts potentially need to be replaced – a challenging job for service engineers.
A less quick and still a little dirty solution was to create a new structure in the PLM system, which provides the service kits and service parts for the defined product, preferably based on the eBOM, if one exists.
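As an illustration of that approach, here is a minimal sketch of deriving service kits from an eBOM. The “serviceable” flag, the wear groups and the part numbers are hypothetical; they only show the principle of grouping service parts from the engineering definition.

```python
# Hedged sketch: deriving service kits from an eBOM. The serviceable flag
# and wear groups are hypothetical illustrations, not a vendor capability.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class EbomPart:
    part_number: str
    description: str
    serviceable: bool = False
    wear_group: str = ""      # e.g. "sealing", "drive" - used to form kits

def build_service_kits(ebom: List[EbomPart]) -> Dict[str, List[str]]:
    """Group serviceable parts into one kit per wear group."""
    kits: Dict[str, List[str]] = {}
    for part in ebom:
        if part.serviceable and part.wear_group:
            kits.setdefault(f"KIT-{part.wear_group.upper()}", []).append(part.part_number)
    return kits

ebom = [
    EbomPart("HOUSING-010", "Cast housing"),
    EbomPart("SEAL-030", "Shaft seal", serviceable=True, wear_group="sealing"),
    EbomPart("GASKET-031", "Cover gasket", serviceable=True, wear_group="sealing"),
    EbomPart("BEARING-040", "Ball bearing", serviceable=True, wear_group="drive"),
]
print(build_service_kits(ebom))
# {'KIT-SEALING': ['SEAL-030', 'GASKET-031'], 'KIT-DRIVE': ['BEARING-040']}
```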
The ideal solution would be for service engineers to work in parallel and in the same environment as the other engineers, but this requires an organisational change.
The organization often becomes the blocker.
As long as the PLM system is considered a tool for engineering, advanced extensions to other disciplines will be hard to achieve.
A linear organization aligned with a traditional release process will have difficulties changing to work with a common PLM backbone that satisfies engineering, manufacturing engineering and service engineering at the same time.
Now, the term PLM becomes Product Lifecycle MANAGEMENT, and this brings us to the core issue: the BOM is too often reduced to a parts list, without understanding the broader product context needed for service or operational support, where the artifacts can be hardware and software in a system.

What is really needed is an extended data model with at least a logical product structure that can represent multiple views of the same product: engineering intent, manufacturing reality, service configuration, software composition, and operational context. These views should not be separate silos connected by fragile integrations. They should be derived from a shared, consistent digital infrastructure – this is what I extract from Prof. Jorg Fischer’s post, albeit that he comes from a strong SAP background with a focus on CTO+.
Most companies are still organized around linear processes with a focus on mechanical products: engineering hands over to manufacturing, manufacturing hands over to service, and feedback loops are weak or nonexistent.
Changing the BOM without changing the organization is like repainting a house with structural cracks. It may look better, but the underlying issues remain.
Listen to this snippet from the BOMversation where Patrick Hilberg touches on this point too.
With this approach, the digital thread becomes more than a buzzword. A digital thread must provide digital continuity, which means that changes propagate across domains, that data is contextualized, and that lifecycle feedback flows back into product development. Without this continuity, digital twin concepts remain isolated models rather than living representations of real products.
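To illustrate what “changes propagate across domains” could mean in practice, here is a minimal sketch of an impact query across cross-domain links. The link table and item names are hypothetical stand-ins for a properly integrated PLM backbone.

```python
# Hedged sketch of digital continuity: when an engineering part changes,
# follow cross-domain links to find every manufacturing, service and
# operational item that must be reviewed. The link table is illustrative.
from typing import Dict, List, Tuple

# (domain, item) pairs linked to each engineering part
links: Dict[str, List[Tuple[str, str]]] = {
    "SEAL-030": [("mBOM", "SEAL-030-LOCAL"),
                 ("sBOM", "KIT-SEALING"),
                 ("twin", "PUMP-SN-7741")],
}

def impact_of_change(part_number: str) -> List[Tuple[str, str]]:
    """Return all downstream items affected by a change to this part."""
    return links.get(part_number, [])

for domain, item in impact_of_change("SEAL-030"):
    print(f"Review {item} in the {domain} view")
```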
However, the most significant barrier is not technical. It is organizational. There is an interesting parallel with how we address climate change and our willingness to act on it.
For decades, we have known what needs to change. The science is clear. The solutions exist. Yet progress is slow because transformation requires breaking established habits, business models, and power structures.

Digital transformation in product lifecycle management follows a similar pattern. Everyone agrees that data silos are a problem. Everyone wants “end-to-end visibility.” Yet few organizations are willing to rethink ownership of product data and processes fundamentally.
So what does the future BOM look like?
It is not a single hierarchical tree. It is part of a maze; some will say it is a graph. It is a connected network of product-related information: physical components, software artifacts, service elements, configurations, requirements, and operational data. It supports multiple synchronized views without duplicating information. It evolves as products change when operated in the field.
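As a minimal sketch of this graph idea (with illustrative node names and edge types, not a specific product or schema), the same network can answer questions from different disciplines:

```python
# Hedged sketch of the "BOM as a graph" idea: one network of typed nodes
# (hardware, software, requirements, service elements) connected by typed
# edges, from which different views can be filtered. All names are illustrative.
nodes = {
    "PUMP-001":    "hardware",
    "CTRL-FW-2.3": "software",
    "REQ-015":     "requirement",
    "KIT-SEALING": "service",
}
edges = [
    ("PUMP-001", "runs", "CTRL-FW-2.3"),
    ("PUMP-001", "satisfies", "REQ-015"),
    ("PUMP-001", "serviced_by", "KIT-SEALING"),
]

def neighbours(node: str):
    """All nodes directly connected to the given node, with the edge type."""
    for src, rel, dst in edges:
        if src == node:
            yield rel, dst, nodes[dst]
        elif dst == node:
            yield rel, src, nodes[src]

# A service engineer's question: what is connected to this product?
for rel, other, kind in neighbours("PUMP-001"):
    print(f"PUMP-001 --{rel}--> {other} ({kind})")
```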
Most importantly, it is not owned by one department. It becomes a shared enterprise asset – with shared accountability for various datasets. But we should not abandon the BOM concept. On the contrary, the BOM remains essential and managing BOMs consistently is already a challenge.
But its role must shift from being a collection of static structures to becoming part of the digital product definition infrastructure, extended by a logical product structure and beyond – the MBSE question.
The BOM is not dead. But the traditional BOM mindset is no longer sufficient. The question is not whether the BOM will change. It already is. The real question is whether organizations are ready to change with it.
Conclusion
Inspired by various BOMversations and AI graphical support, I tried to reflect the business reality I have observed for well over 10 years. Technology and academic truth do not create breakthroughs in organisations, due to the big legacy and the fear of failure. Will AI fix this gap, as many software vendors believe, or do we need a new generation with no legacy PLM experience, as others suggest? Your thoughts?
p.s. My trick to join the BOMversation without being thrown from the balcony 🙃






