Robert Foy, Former Director of Professional Services and Strategic Accounts, Asyst Technologies, Inc.
Andre Crump, Senior Director of Software Product Marketing, Asyst Technologies, Inc.
Take a tour of manufacturing in the world today and you will encounter widely varying levels of sophistication across a broad spectrum of industries. From the nanometer-level geometries of semiconductor manufacturing to the careful recipe and batch-purity controls of pharmaceutical production, all industries, regardless of the underlying technology, struggle with the challenge of managing their businesses to a standard of simultaneous high profits and uncompromising product quality. Meeting these dual challenges requires good basic business processes in addition to an appropriate “fuel” to drive those processes. It is no surprise, then, that information is a key ingredient in any successful manufacturing industry’s high-octane diet.
Transferable data has existed since the first chisel was placed to a stone tablet; however, simply getting more information is not enough. What today’s industries need in order to be competitive is relevant, specific, and organized information. Information that is not orderly, validated, and coalesced often turns out to be exactly the opposite of helpful. Like an army without a command-and-control structure or a verifiable and hierarchical source of intelligence, businesses whose information is not relevant and organized will find predictable efficiency and profit margins quite elusive.
How does one define the phrase “relevant information”? The dictionary provides some revealing clues: having a bearing on or connection with the subject at issue; pertaining to; germane; useful in reaching a solution.
Relevant information is thus by definition connected to the topic at hand, and can be considered a subset of all available information as a whole. Relevant information highlights problems, determines causes, and enables solutions. It is the first stepping stone on the path to better results. Put another way, effective solution determination consists of appropriately narrowing the scope of a problem and then investigating its possible causes. The proper starting point therefore is to decide what information is relevant, determine what else you need to know, see what relevant data has not been gathered, and confirm what irrelevant paths you can eliminate.
The road to relevancy, however, has several precursors. Before information can be relevant it must be accurate. In turn, before it can be considered accurate it should be authenticated. Of course, before it can be authenticated it has to be accessible. Taking this logic to its inevitable conclusion, it can be said that before information can be accessible, it must originate; in other words, it must have an origin.
It is no coincidence that many current cross-industry initiatives to improve manufacturing profit and quality focus on applying the concepts of origin and accessibility. These are two of the foundational tenets of the doctrine of relevant information. Investigations of these two factors have demonstrated that new systems can apply them with great success. In fact, emerging equipment information-bridging technology is now able to break down cross-pollination barriers across industries, and thus open up opportunities for greater manufacturing process efficiency and improved quality solutions. This new software technology can also be used to assimilate and represent a wide variety of information types, as well as provide real-time access to active streams of distributed equipment data. All of these benefits are integral to the requirements for relevant information.
Going to the origin
The basis of new manufacturing technology solutions rests on the developers’ and users’ understanding of the principles of relevant information. For example, let us consider in more detail the issue of origin. To have relevant information, a question one needs to answer is: What is the origin of the information? Depending on the industry, that answer will vary. In the medical profession, some practitioners may consider animal models, clinical trials and post-treatment patient result histories to be the most significant origin of information. In drug manufacturing, purity and testing protocols, in addition to batch recording and reporting, may comprise parts of the data’s origin.
Modern manufacturing, unlike many other business activities, necessitates the clear identification of origin for information to be at its most relevant and useful. In practice, that origin comes from the equipment, tools, and systems that perform the work of the manufacturing process. Manufacturing, more than any other endeavor, is built upon machines, memory, and mechanization. The machine aspect is obvious, but the memory is usually that of either human beings or storage systems, neither of which can reliably operate in absolute real time, flexibly, and with sustained accuracy. In the past, memory has been the primary origin of certain types of information, a reason, some would say, for the degradation in its relevancy. Mechanization, however, is the preferred “go to” source of the information that all modern manufacturers must successfully master and manipulate on a daily basis. In these situations, relevant information originates from questions such as: How is the process running? How busy are the tools? What is the current variability in the process? What is the quality of the products being made? How much product is complete? What is my capacity utilization? Where are various lots in the process? What is their composition? What is still under way? And, perhaps most importantly, how much can be made, and at what cost margins?
The challenge is not so much knowing how to answer these questions given the right information, but rather knowing how to obtain the right information in the first place. The answers to all these questions and more are contained within the origin of relevant manufacturing information (RMI).
Relevant manufacturing information
So what is the problem here? Simply put, most industries are challenged by mechanization that doesn’t gently relinquish its grip on valuable data. A wide range of tool types, processes, protocols, requirements, technologies, and OEMs means the data that does reside in the mechanization is as varied and different as its sources. Standards have been attempted in different industries to address this problem. For example, in semiconductor it might be SECS/GEM and in automotive it might be MAP. In pharmaceuticals and biotechnology the quality and safety of the end product is paramount, so on top of the standards we see a number of requirements from the FDA, governmental rules such as 21CFR Part 11 (Title 21 Code of Federal Regulations for Electronic Records; Electronic Signatures), and of course ISO and ISA. In addition to these are other solution attempts including CIP, TOP, DeviceNet, SensorBus, MODBUS, and EtherNet/IP, to name a few.
All of these protocol standards have their individual merits and weaknesses, and each deserves to vie for its respective role in controlling the packaging of data. However, the very existence of such a long and varied list of protocols also means that manufacturing industries must accept the truth that no single protocol can reign over all. One can even argue that it is necessary for a diversity of protocols to exist in order to accommodate the unique characteristics and environments of the varied origins of the information, as well as to continue to fuel innovation, although in some cases it creates inefficiency. Still, while a factory’s mechanization and the associated origins of data may represent a diverse group, the mission for the business behind it is singular in its purpose: operate efficiently, maximize profits, reduce costs, and improve the value of the information supporting the first three objectives.
A technology solution
New software technology is becoming available that bridges the heterogeneous world of information origins and assimilates them into a common platform from which ease of use, accessibility, and relevancy can grow. Fundamental to this equipment information-bridging technology is the concept that no protocol is left behind. Having its genesis in the wildly diverse semiconductor industry, the bridging software has been designed from the start to embrace the widely varied array of information protocols already plaguing that industry. Even more significantly, the bridging technology creates a level playing field by not assuming as definitive any particular protocol.
The immediate effect of this technology is faster access to relevant information. Users will have already defined what desired data is encompassed in the manufacturing process, but often will have been unable to retrieve it in conditions that maintained its relevancy. For example, the information may have arrived too late to affect current batch outputs, or the origin of the information may have rendered it useless. These situations would have resulted from the Wild West environment of protocols, or lack of protocols, in which they operated. Fortunately, what some manufacturers may have once considered a miasma of protocols is now being reconstituted into a tangible solution into which any data can be incorporated, regardless of source. Where users were once required to ask, HTTP versus SECS/GEM? or process tool A versus quality tool B?, they now find the question irrelevant to the results. They can have it all: multiple protocols and multiple sources, as well as multiple clients.
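The "no protocol left behind" idea can be illustrated with an adapter pattern: each protocol gets a small parser that normalizes its messages into one common record type. The sketch below is a minimal illustration, not the actual bridging product; the class names, the JSON payload shape, and the simplified `tool:param=value` stand-in for a SECS/GEM report are all invented for this example (real SECS-II messages are binary-structured).

```python
import json
from dataclasses import dataclass

# Common record type every protocol adapter normalizes into.
# All names here are illustrative, not taken from any real product.
@dataclass
class Reading:
    tool: str
    parameter: str
    value: float

class HttpJsonAdapter:
    """Parses a JSON payload such as an HTTP-connected tool might post."""
    def parse(self, payload: str) -> Reading:
        d = json.loads(payload)
        return Reading(d["tool"], d["param"], float(d["value"]))

class SecsLikeAdapter:
    """Parses a simplified 'tool:param=value' message standing in for a
    SECS/GEM report."""
    def parse(self, payload: str) -> Reading:
        tool, rest = payload.split(":", 1)
        param, value = rest.split("=", 1)
        return Reading(tool, param, float(value))

class Bridge:
    """Routes each raw message to its protocol's adapter and collects
    uniform Readings, so clients never ask 'HTTP or SECS/GEM?'."""
    def __init__(self):
        self.adapters = {}
        self.readings = []

    def register(self, protocol: str, adapter) -> None:
        self.adapters[protocol] = adapter

    def ingest(self, protocol: str, payload: str) -> Reading:
        reading = self.adapters[protocol].parse(payload)
        self.readings.append(reading)
        return reading

bridge = Bridge()
bridge.register("http", HttpJsonAdapter())
bridge.register("secs", SecsLikeAdapter())
bridge.ingest("http", '{"tool": "implantA", "param": "beam_current", "value": 4.2}')
bridge.ingest("secs", "sterilizerB:chamber_temp=121.0")
```

Once both messages pass through the bridge, downstream clients see only uniform `Reading` records; adding a new protocol means registering one more adapter, with no change to any consumer.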
So now, having the origin of our information under control we can turn our attention to the next requirement for relevance: accessibility.
Accessibility is defined as 1) the quality of being at hand when needed, and 2) the attribute of being easy to manage. One sticking point is that possessing the origin of your information still does not always make that information accessible. Air traffic controllers around the world have the origins of their relevant data under tight control, but, like our multiple protocols of data, the originating pilots and countries may speak many different languages. For the purpose of accessibility, most international air traffic controllers subscribe to one model of communication: English. In a similar fashion, equipment information-bridging technology takes the information from various origins under its control and then models the data into a universally accessible form.
Models exist in all fields, and modeling has long been an effective way to manage and make accessible complex environments. Models are made up of real or theoretical objects and their respective processes. A scientific model is a set of ideas that describes a natural process. Architects use models to understand complex building plans. Biochemists use models to help visualize the various compounds they devise. Modeling is widely used in genetics; without models, many DNA discoveries would have been impossible. The U.S. Human Genome Project began in 1990 and relied heavily on the concept of information modeling to manage its monumental task. Since then, modeling has continued to be embraced across the research and business spectrum for purposes ranging from drug development to grid computing.
Figure 3. Equipment-bridging technology takes the information from various origins and protocols and presents the data in a universally accessible, relevant and cost-effective form, bypassing logjams and time lags.
In a similar fashion, emerging information-bridging technology effectively leverages the concept of models to manage various data. Stemming as it did from the protocol-rich semiconductor industry, the technology has been implemented to accommodate a growing variety of communication and connectivity standards, principally SEMI’s E120 (Common Equipment Model), E125, E132 and E134, and following the XML guidelines as established by SEMI E121 and E128. However, as with its handling of various protocols, the information-bridging software was designed to model not only semiconductor data and equipment, but also data and mechanized processes across any industry where it is needed, including pharmaceuticals, biotech, aeronautics, and automotive.
Models in this new methodology are essentially the common language into which all the different data are combined and assimilated. These models are XML structures that describe in an intuitive way the structure and organization of the origins of the information. In semiconductors this might mean an implant tool with multiple chambers, various subsystems, and a host of variable, parameter, and measurement data associated with each. In pharmaceuticals and biotechnology it could mean the equipment required to validate sterilization processes, or the clusters of equipment needed for specific types of organic synthesis. Since the software actually models systems of machinery and data according to object-oriented precepts, with hierarchical structures and organizations, the data becomes what is referred to as discoverable. This is just another way of saying that data represented inside the model is accessible, intuitive, and well on its way to being relevant.
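To make "discoverable" concrete, consider a toy hierarchical XML model in the spirit of the SEMI E120-style equipment models mentioned above. The element and attribute names below are invented for illustration and are not taken from the actual SEMI schemas; the point is only that a hierarchical model lets a client walk or address data without prior knowledge of the tool's layout.

```python
import xml.etree.ElementTree as ET

# A hypothetical equipment model: an implant tool with two chambers,
# each exposing named parameters with units. Names are illustrative.
model_xml = """
<Equipment name="ImplantTool-01">
  <Module name="ChamberA">
    <Parameter name="BeamCurrent" units="mA">4.2</Parameter>
    <Parameter name="Pressure" units="Torr">1e-6</Parameter>
  </Module>
  <Module name="ChamberB">
    <Parameter name="BeamCurrent" units="mA">4.1</Parameter>
  </Module>
</Equipment>
"""

root = ET.fromstring(model_xml)

# Discovery: enumerate the hierarchy without knowing it in advance.
inventory = []
for module in root.findall("Module"):
    for param in module.findall("Parameter"):
        inventory.append((module.get("name"), param.get("name"),
                          param.text, param.get("units")))

# Pinpoint access: address one value by path once its location is known.
beam = root.find("Module[@name='ChamberA']/Parameter[@name='BeamCurrent']")
```

Both styles of access, broad discovery and pinpoint lookup, come for free from the hierarchical structure, which is the practical meaning of the "discoverable" claim in the text.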
Data models of processes and clusters
One of the great benefits of using data models with equipment information-bridging technology is that they allow a variety of incremental capabilities. For example, these solutions are not limited to making and implementing models of a tool and its subsystems. Models can actually be made of entire groupings, or clusters, of tools and their subsystems used in a process. In this way, the available data represents the activity as a whole, relevant at a granular level, but also relevant as an entire process. One advantage of this ability is that users can examine variations related to materials handling, or they can reduce the amount of setup time between activities, or employ e-diagnostics applications that can examine the entire tool cluster simultaneously, as opposed to one device at a time. Because, however, the model also represents each tool and its subsystems, users can examine with pinpoint accuracy any area of interest.
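The two granularities described above, pinpoint queries against one tool and whole-cluster views, can be sketched with a small object hierarchy. This is a minimal illustration under assumed names (`Tool`, `Cluster`, and the metrics are all hypothetical), not a description of any vendor's actual API.

```python
# Sketch of cluster-level modeling: a cluster aggregates tool models,
# so the same data can be queried per tool or for the process as a whole.
class Tool:
    def __init__(self, name, readings):
        self.name = name
        self.readings = readings  # e.g. {"temp": [119.5, 121.0]}

    def latest(self, metric):
        return self.readings[metric][-1]

class Cluster:
    def __init__(self, name, tools):
        self.name = name
        self.tools = {t.name: t for t in tools}

    def latest(self, tool_name, metric):
        # Pinpoint query: one tool, one metric.
        return self.tools[tool_name].latest(metric)

    def snapshot(self, metric):
        # Cluster-wide view: the same metric across every tool at once,
        # the kind of view an e-diagnostics application might consume.
        return {name: t.latest(metric)
                for name, t in self.tools.items()
                if metric in t.readings}

cluster = Cluster("sterilization_line", [
    Tool("autoclave1", {"temp": [119.5, 121.0]}),
    Tool("autoclave2", {"temp": [120.2, 121.3]}),
])
```

Because the cluster model contains each tool model rather than replacing it, nothing is lost by aggregating: `snapshot` serves the whole-process view while `latest` still drills down to a single subsystem.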
The advantages of modeling continue further. Levels of automation and quality control that previously were difficult, time consuming and expensive in terms of integration and configuration suddenly become much more possible. Two pieces of sterilization equipment, for example, that work in tandem but use different protocols, would realize a much higher level of efficiency and effectiveness if modeled and accessed together.
Greater automation, quality and control
Equipment information-bridging technology allows immediate, real-time access to information origins. The new technology ensures that data has already been authenticated, and because it comes directly from the tool or process, there is a high probability that it is also accurate. These elements combine to provide a leaner, meaner, and more efficient manufacturing process control functionality, one capable of effectively leveraging additional solutions such as SPC, APC, e-Diagnostics, and Run-to-Run control. It also enables the level of reporting granularity that certain industries are required to provide. The data can even be seen as a form of web services, one that authorized applications can access as needed.
When all is said and done, new technology and data models will lead manufacturers to higher levels of profitability and competitiveness based on their ability to access information that is accurate, authenticated, and relevant, regardless of protocols. This is the high-octane fuel that businesses of today require.
Robert Foy is the former Director of Professional Services and Strategic Accounts in the Connectivity Solutions Group at Asyst Technologies, Inc. He can be contacted at firstname.lastname@example.org
Andre Crump is the Sr. Director of Software Product Marketing in the Connectivity Solutions Group at Asyst Technologies, Inc. He can be contacted at email@example.com