PLM Quick 30: The Origin and Evolution of PLM (ft. Jim Roche)

Apr 25, 2019

On this episode of the PLM Quick 30, we welcome Jim Roche from CIMData. Jim’s years of experience and expertise in the Aerospace and Defense industry enable him to provide novel insights into the history of PLM and how it has evolved over the years. Access the full podcast below.


Contact us if you’re ready to discuss a dynamic PLM strategy.

Transcript

The transcript is a near-literal rendering of the spoken word. Please excuse any grammatical errors, spelling errors, or breaks in the flow. The podcast is a non-scripted conversation with a natural flow, aimed at delivering value. The transcript is generated by a computer.

Patrick: Hello, welcome to ArcherGrey’s PLM Quick 30, where we discuss all things PLM. I’m your host, Patrick Sullivan, and today’s guest is Jim Roche from CIMData. Jim is an expert in PLM, and he is also an expert in the Aerospace and Defense industry. We had such a great conversation, with really good information, that we changed the format and split this podcast into two parts, two separate podcasts. In part one, we dive into how PLM has transformed throughout the decades; we actually talk about its origination in 1959, and that lays the foundation for part two. With all of this growing software and technology, there is a growing need for standards around how the software and the processes are configured and implemented, and so CIMData and Jim are leading the Aerospace and Defense PLM Action Group, whose mission is to help set those standards. So you don’t want to miss part one, because it lays the foundation for part two, and you certainly don’t want to miss part two, because it will eventually influence how you pursue your PLM initiatives. So without further ado, part one of the ArcherGrey PLM Quick 30.

***

Patrick: You are really at the forefront of developing new standards and ways for Fortune 500 companies to embrace the concept and implement it globally.

Jim: I think a lot of people working in this field look at it in a similar way. They know that they are doing something that is really fundamentally important to their company, and the challenge today is really taking the intelligence behind that information flow and being able to automate it.

***

Patrick: Hello, everybody. Welcome to ArcherGrey’s PLM Quick 30, where we talk about all things PLM. I’m really excited to have our guest today, Jim Roche from CIMData. Welcome to the podcast, Jim.

Jim: Thank you, Patrick. Happy to be here. Looking forward to it.

Patrick: Yeah, great. So why don’t we take a few minutes here to get to know you a little bit, for people who aren’t familiar with you and CIMData? Let’s start with you first. If you could just take a couple of minutes here and give us an introduction to who you are and what you do for CIMData.

Jim: Okay, very good. My background extends over nearly 40 years now. I started with university studies in Physics, and when I left the university, I took a job as a machine tool designer and worked in manufacturing engineering for a small machine tool company. It turned out that being a machine tool designer really meant being a mathematician and computer programmer, because the gentleman who founded the company was a frustrated aeronautics engineer. He had a really super idea about how you could do metal forming in a way it had never been done before, but he needed a lot of mathematics and computer models to do it. Anyway, I enjoyed that work very much. I left that job and went to work for a larger machine tool company, and after that I went to General Motors at their Manufacturing Engineering Advanced Development Center at the GM Tech Center. I was happy with my life as a manufacturing engineer, went through a number of different assignments, and was working in the area of Computer Integrated Manufacturing when General Motors bought EDS. Overnight I became a data processing person working for EDS, but I enjoyed that. I thought it was great. GM was my customer, I was still doing Computer Integrated Manufacturing, and through a number of twists and turns, I went from Computer Integrated Manufacturing to the position of Chief Architect for General Motors’ modernization of their product development environment globally. I did that for a couple of years, getting that program up and running to the point where they were in execution. At that point, my boss asked me what I wanted to do, and I said, “Well, I really like the idea of Engineering Data Management,” which is what we called it at that time. So we collected all the resources that were delivering basically a PLM into the engineering environment at General Motors and then flowing it down into manufacturing. Those people were assigned to me, and we were delivering that capability to GM and started selling it to other companies outside of GM, including McDonnell Douglas for the FAP and then other Aerospace projects. At that point, GM EDS decided they wanted to get into consulting, so they bought A.T. Kearney. I moved to A.T. Kearney and started up a practice for them. After a few years, I left A.T. Kearney and went to Computer Sciences Corporation and started up a PLM practice for them. One of my first accounts there was Lockheed Martin, on what we called at that time the JSF Program. I was responsible for the team that was re-engineering how product development would be conducted on the JSF Program, which turned into the F-35, and a number of other interesting projects, including Rolls-Royce in the UK, Proton in Malaysia, and United Defense, which was a mobile howitzer program in Minneapolis. At that point in my career, I was spending two weeks in Derby, England; two weeks in Minnesota; and then two weeks in Kuala Lumpur, and I did that for about 18 months. After that, I went to work with what eventually became Siemens PLM, but at that time it was part of EDS: the consolidation of what had been Unigraphics and their acquisition of SDRC into a business unit for EDS. I was responsible for leveraging that business into the broader EDS business. They sold it to venture capital investment firms and then eventually to Siemens, and I had a number of interesting assignments in Aerospace and Defense and other industries as well.
I remember working for Church & Dwight, which is the manufacturer of Arm & Hammer, and trying to figure out how to do PLM for products made from baking powder and baking soda, and working for Medtronic, figuring out how to use PLM for pacemakers. But that’s a very long story. I apologize for dragging it out so much, but I retired in 2009 after a 30-year career, and two years later I decided I wanted to get back to consulting. That’s when I joined CIMData and got back into providing strategy, solution selection, and the setup of major PLM programs for a number of commercial companies, which is what I really enjoy. That’s my passion.

Patrick: Yeah. I mean, first of all, thank you for going through the history. It sounds like an amazing career where you’ve been at the forefront of the evolution of, and I’m going to generalize the term, just saying PLM. I know it’s common language today, but you’re really at the forefront of developing new standards and ways for Fortune 500 companies to embrace the concept and implement it globally. Looking back, it seems like you were at the forefront throughout your entire career.

Jim: Yeah, it turned out to be that way. I really enjoyed it, and I think a lot of people working in this field look at it in a similar way. They know that they are doing something that is really fundamentally important to their company, and they know that it’s something the company does regardless. I always used to say that you can do PLM without computers. In fact, every company that has a product does PLM. The primary component of a successful PLM system is the smart people who know what information is required, where to get it from, and who to pass it to, and who really understand the intricacies of that system. The challenge today is really taking the intelligence behind that information flow and being able to automate it, and to provide that underlying repository of information where the key elements of the product information are captured, protected, and related appropriately to each other. It’s like a puzzle, and it’s a puzzle that really is the heart of the success of the company. So I think people who like to do puzzles, puzzles that mean something, are drawn to PLM.
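To make that “puzzle” concrete, here is a minimal sketch in Python of the repository idea Jim describes: key elements of product information captured as items, with typed relationships between them. All class, field, and relationship names are hypothetical, chosen for illustration; this is a toy model of the concept, not any particular PLM product.

```python
# A minimal sketch of an underlying repository where key elements of product
# information are captured and related appropriately to each other.
# All names are hypothetical and for illustration only.
from dataclasses import dataclass, field

@dataclass
class Item:
    """A single piece of product information (a requirement, a part, a drawing)."""
    item_id: str
    kind: str            # e.g. "requirement", "part", "drawing"
    description: str
    revision: str = "A"

@dataclass
class Repository:
    """Captures items and the typed relationships between them."""
    items: dict = field(default_factory=dict)
    links: list = field(default_factory=list)    # (from_id, relation, to_id)

    def add(self, item: Item) -> None:
        self.items[item.item_id] = item

    def relate(self, from_id: str, relation: str, to_id: str) -> None:
        # Record a typed relationship, e.g. ("REQ-001", "satisfied_by", "PART-100").
        self.links.append((from_id, relation, to_id))

    def related(self, item_id: str, relation: str) -> list:
        # Follow one kind of relationship outward from an item.
        return [self.items[t] for f, r, t in self.links
                if f == item_id and r == relation]

# Usage: capture two items and relate them, the way the smart people Jim
# mentions would mentally connect a requirement to the part that satisfies it.
repo = Repository()
repo.add(Item("REQ-001", "requirement", "Door panel must meet crash spec"))
repo.add(Item("PART-100", "part", "Left front door outer panel"))
repo.relate("REQ-001", "satisfied_by", "PART-100")
print([i.item_id for i in repo.related("REQ-001", "satisfied_by")])  # ['PART-100']
```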

Patrick: I love that analogy. I try to come up with analogies frequently, and some are good and some fall flat, but that one’s definitely a good one. We did a podcast with the Managing Partner and CEO of ArcherGrey about a month ago, which, by the way, you can find on ArcherGrey’s website in the resources and podcasts section. It was titled “Deriving the Most Value From Your PLM System,” but really what we were discussing was the evolution of PLM. Jack Schroeder is the partner’s name, and he developed a Product Data Management system in the early 90s; he has also been at the forefront of the industry. I really wanted to get his impression of trends throughout the years, and I tried to frame it in terms of decades: his exposure in the 90s, into the 2000s, and now that we’re into this digital thread terminology, how PLM has transformed. I love your comment that you can do PLM without computers, and that really the computers today are taking the intelligence of that information flow and automating it. Thanks for sharing your perspective on that. So in relation to what I’ve just said, I’m curious to know your perspective. Maybe it’s generous to go from the 80s to the 90s, but could you take a few minutes and give us a quick glimpse of the timeline, the 80s to the 90s, the 90s to the 2000s and beyond, and summarize that evolution? Because I think your analogy of a puzzle is perfect.

Jim: Yeah, okay. Well, it’s interesting, because I talked about GM funding this program to modernize their product development on a global scale, and that started in the late 80s. I was interviewed after we had defined the architecture for the solution, while we were in the process of selecting the strategic suppliers that would actually deliver it. In that interview, I was asked about what at that time we called “art to part”; that was a popular term. In the automotive industry, art to part included the idea of going from a clay model to rendering the body shape not as clay but as surfaces within a computer; then going from that to sheet metal design, adding the additional elements to the periphery of the sheet metal shape and designing the internal panels; then going from the design of the parts to understanding the metal flow characteristics, so that you could unfold the part to a flat sheet and then design the series of dies that would form it into the shape you needed; then taking the die design and driving numerically controlled machinery to inspect the dies and then inspect the panels; and then driving the robotics for welding the panels together to come up with the automotive car body. That was called art to part, and the data that was generated and handed between these stages was called the Data Pipeline. So the question was, where did that originate? I answered the interviewer and said it came out of the GM Research Lab in 1959. In 1959 they had this concept, and they went to IBM and asked IBM to invent the computer systems for General Motors to do this art to part, and IBM said, “Absolutely no way. It’s impossible. It can’t be done.” At that point, the folks at the GM Research Lab said, “Oh boy, that’s just the kind of thing we want to take on,” and they started developing the Corporate Graphics System that actually started doing that work. Anyway, fast forward a little bit: within Aerospace, they were facing similar types of issues, and the early PLM solutions actually came out of Aerospace.
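The art-to-part Data Pipeline is easy to picture as a chain of stages, each consuming what the previous stage produced. The toy Python sketch below paraphrases the stages from the conversation; the function names and string placeholders are purely illustrative, not any real GM system.

```python
# A toy sketch of the "art to part" Data Pipeline: each stage consumes the
# data produced by the stage before it. Stage names are paraphrased from the
# conversation; everything here is illustrative.

def surface_model(clay_scan: str) -> str:
    # Render the clay model's body shape as surfaces in the computer.
    return f"surfaces({clay_scan})"

def sheet_metal_design(surfaces: str) -> str:
    # Design the sheet metal parts and internal panels from the surfaces.
    return f"sheet_metal({surfaces})"

def die_design(sheet_metal: str) -> str:
    # Understand metal flow, unfold to a flat sheet, design the forming dies.
    return f"dies({sheet_metal})"

def nc_and_robotics(dies: str) -> str:
    # Drive NC machinery to inspect dies and panels, then robotic welding.
    return f"nc_weld({dies})"

PIPELINE = [surface_model, sheet_metal_design, die_design, nc_and_robotics]

def run_pipeline(initial: str) -> str:
    data = initial
    for stage in PIPELINE:
        data = stage(data)  # the hand-off between stages is the Data Pipeline
    return data

print(run_pipeline("clay_model_scan"))
# nc_weld(dies(sheet_metal(surfaces(clay_model_scan))))
```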

Patrick: So Jim, if I could just interrupt you for a second-

Jim: Sorry to spend the whole time talking about this stuff.

Patrick: Well, no, it’s fascinating. I’m glad that you’re talking about it. I love the term art to part. You started the segment on the General Motors story, and I appreciate the background, because I think it really brings it to life. You mentioned that you were dealing with this in the late 80s, and maybe this is a naive question; I brought up the 80s to the 90s, but I believe the timeline that you’re referencing is during the 80s, while it really originated at GM in 1959. Is that correct?

Jim: That’s the only historical reference I could find, and that was when IBM said it was impossible and the GM Research Lab said, “We’re gonna do it.”

Patrick: Great.

Jim: And they started developing the Corporate Graphics System, and probably within about 10 years they were modeling sheet metal, then unfolding the sheet metal and starting to model the dies that would form it.

Patrick: Okay, great.

Jim: That goes back a ways, so the program, and I think the 80s is a good place to start, Patrick, because at that point, if you consider what GM was trying to do, they wanted to consolidate from 14 CAD systems to three. That was a big goal. You look at it now and people say, jeepers, most companies today are standardized on one CAD system, but at that time GM was using everything: Calma, CADAM, what at that point was called Unigraphics. CATIA was there too, and so they had this big bake-off. But the point I wanted to make is that all these systems came out of Aerospace. CATIA came out of Dassault Aviation. I think CADAM came out of another one. Unigraphics came out of McDonnell Douglas, and I’m sure I’m missing some, Computervision among them. These all came out of different Aerospace companies, so they were the leaders in developing these internally, and GM had the Corporate Graphics System. By the time the 80s came along, these companies were saying the systems are too big, they’re too expensive, we can’t sustain them on our own, so we need to go commercial. So they started these commercial divisions and tried to sell their software to the broader community that would sustain the investment. At that time, there was really only one significant commercial solution in the space of Engineering Data Management, and that was Sherpa. There was another product developed by HP called WorkManager, I believe, and then in the 90s you wound up with a company called Matrix that came out with a really, really advanced capability that started to take hold. But the point I wanted to make regarding the changes over the decades is that the mid to late 80s is when you started to see commercial products emerging, coming primarily out of Aerospace, and then in the 90s you started to see some very viable commercial products beyond Sherpa. We had Info Manager, which came out of UGS, and others, and there was a very capable offering from IBM called Product Manager that never really took off. It was very robust, but it was very difficult to use; the interface was very traditional, mainframe, blah, blah, blah, and IBM didn’t really support modernizing it. That wound up being a Dassault product for a while. Matrix was also acquired by Dassault. So there was a lot of turmoil in the 90s in terms of capabilities. You had DMCS from SDRC. You had Info Manager from UGS. You still had Sherpa, and then you had the people who spun off from the SDRC organization, Jim Heppelmann among them, who started up their own company and came out with a product of their own. That was the birth of Windchill, and it ended up at PTC, which also acquired Computervision. Things started to settle down after 2000, and the products started to become more robust and more competitive. So I’d say that between the 90s and the 2000s, we saw the products that supported what we used to call, in the art-to-part days, the Data Pipeline become more robust, really offering the possibility of integrating and managing not just engineering files, access to the engineering files, and change control of the engineering files, but actually starting to expand to manage product structure and formal release at the enterprise level.
So that came in the 2000s, and after that, as the manufacturing applications started to expand, the software that managed the product information started to become richer, able to manage not only the product definition but the manufacturing process bill of material. Where we are now, in this last decade, 2010 to 2020, we’re seeing another explosion in the extension of the lifecycle to include the area of service and, if you get into the operational product, IoT, the Internet of Things, to be able to actually monitor the capability and operational characteristics of the deployed product. And then there’s the information environment, or platform. I think there’s more discussion now of the idea of the platform for managing the content and the activity against the product information. That architecture has really matured over the past three to five years. I wouldn’t say that we really have robust platforms now, but we’re getting to the point where we’re going to have robust platforms that manage all of the product information across the lifecycle: from the initial requirements, to the product definition, to the manufacturing definition, manufacturing execution, and then product support. Attached to that platform would be the applications that consume that product information. So that’s a new architecture, and one of the things that we find is that vision leads actualization by at least 10 years. The idea of having these applications that consume the product information be able to plug into an information resource that controls and manages how the content is generated and accessed really began to emerge as an architectural concept around the 2000s, but we’re just now seeing relatively respectable product implementations that realize that architecture.
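Here is a rough sketch of the platform architecture Jim outlines, under stated assumptions: one platform holds product information across the lifecycle phases he names, and applications plug in and consume it rather than owning their own copies. The class and method names are hypothetical.

```python
# A hedged sketch of a lifecycle information platform with pluggable
# applications. Phase names come from the conversation; class and method
# names are hypothetical.
from enum import Enum

class Phase(Enum):
    REQUIREMENTS = "requirements"
    PRODUCT_DEFINITION = "product definition"
    MANUFACTURING_DEFINITION = "manufacturing definition"
    MANUFACTURING_EXECUTION = "manufacturing execution"
    PRODUCT_SUPPORT = "product support"

class Platform:
    """Controls and manages how product information is generated and accessed."""
    def __init__(self) -> None:
        self._store = {phase: [] for phase in Phase}

    def publish(self, phase: Phase, record: str) -> None:
        self._store[phase].append(record)

    def query(self, phase: Phase) -> list:
        # Applications consume content through the platform; they never own it.
        return list(self._store[phase])

class ServiceApp:
    """An application attached to the platform, here for the service phase."""
    def __init__(self, platform: Platform) -> None:
        self.platform = platform

    def open_work_order(self) -> str:
        # A service activity consults the released product definition.
        specs = self.platform.query(Phase.PRODUCT_DEFINITION)
        return f"work order referencing {len(specs)} released definition(s)"

platform = Platform()
platform.publish(Phase.REQUIREMENTS, "REQ-001: braking distance under 40 m")
platform.publish(Phase.PRODUCT_DEFINITION, "PART-100 rev B released")
print(ServiceApp(platform).open_work_order())
# work order referencing 1 released definition(s)
```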

Patrick: Yeah, it’s interesting, that comment about vision leading realization by about 10 years. One word that I’ve always associated with PLM and its evolution, as I’ve been exposed to it over the last five years, is Utopia. If we can say that PLM was coined in the 90s, that’s when the definition was established, but I would say that within the last five years we’re finally getting to what was originally envisioned as PLM, and it wasn’t just managing the product data, it was managing the lifecycle of the product, right? The full lifecycle. So now we look to these new visions being established around the digital thread, and we’re finally at a point where the hardware can support the ideas that people are having. So this realization timeline of 10 years, I totally agree, and I’m excited about it coming more expeditiously than the 20+ years from the 90s to where we are today. So thanks for walking through those decades. I’ve been taking some notes: I’ve got art to part and the Data Pipeline, to the evolution of product structure, to the evolution of managing the BOM, and then extending into the lifecycle to include service and obsolescence.