Stat-Ease hosts the premier annual conference on practical applications of industrial design of experiments (DOE). Check back in 2023 for information about the next DOE conference.
All our presenters' abstracts are listed below, along with links to their bios and slides and to the recordings on our YouTube channel. Or, click the link below to go straight to the playlist on YouTube!
2022 Online DOE Summit Playlist on our channel: Statistics Made Easy by Stat-Ease
This presentation spells out a fun physics experiment that illustrates the advantage of multifactor testing over the traditional one-factor-at-a-time (OFAT) scientific method. See how Mark and his grandson Archer uncovered multiple interactions that surprisingly canceled out OFAT main effects. Their home experiment on bouncing balls affirms the application of DOE for building profound data-driven process knowledge. Attend this talk for the fun and the education!
Setting specifications is often a difficult problem. Here we use a real-life example to illustrate how specifications on inputs, called process windows, can be set statistically to ensure a given performance on a response. The problem is a crack issue involving a hard-to-vary mold temperature factor. Measurement systems are checked, and data mining is used to get clues about the problem. Design-Expert is then used to create an optimal split-plot DOE to assess the drivers of the crack issue, with five conditions for accepting a conclusion from a DOE given along the way. Armed with this knowledge, a procedure is carried out on a second quadratic split-plot design to set process windows on the inputs. Since specifications are usually set on individual units rather than on averages of units, the contour plot for the average response must be converted into a contour plot for individual response values so that the process windows can be read off. The procedure is general enough to handle two, and possibly three, input factors when determining statistically based process windows in Design-Expert.
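The conversion from a contour of the average response to one for individual values hinges on a prediction bound for a single new unit. A minimal sketch of that step in Python, with every number (RMSE, degrees of freedom, leverage, fitted mean, confidence level) hypothetical rather than taken from the case study:

```python
import numpy as np
from scipy import stats

# Hypothetical fit summary at one candidate factor setting:
s = 0.8          # root mean square error of the fitted model
df = 12          # residual degrees of freedom
lev = 0.25       # leverage of the prediction point
yhat = 25.0      # fitted *average* response

# One-sided 95% lower prediction bound for a single new unit;
# this is what the individual-value contour plots instead of yhat.
t = stats.t.ppf(0.95, df)
lower_individual = yhat - t * s * np.sqrt(1 + lev)
print(round(lower_individual, 2))
```

Repeating this over a grid of factor settings yields the individual-response contour from which the process window can be read.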
Design of experiments is a powerful tool that combines scientific knowledge with the rigor, effectiveness, resourcefulness, and reliability of statistics. When used in the pharmaceutical industry, it saves time and resources and allows for the best outcomes from the experiments that feed decision-making. One area where design of experiments is commonly used is process characterization, which is performed to provide objective evidence that critical product and process parameters can meet a high standard of product consistency and quality. This activity should be undertaken once some basic level of process understanding is in place. Process characterization designs are the product of a comprehensive risk assessment that draws on the quality target profile and subject-matter expertise to decide which factors should be studied to deliver on the promise to patients of high-quality products. This talk will focus on the steps taken to plan, design, and analyze a process characterization exercise, with the goal of establishing the proven acceptable range for each unit operation in a multi-unit process, such as a cell-based process with upstream and downstream components.
The 2012 Clint Eastwood movie “Trouble with the Curve” features an up-and-coming baseball slugger. We later learn that the hitter has trouble hitting a curve ball; it seems he has managed to avoid dealing with the curve by hitting fastballs so hard that the best pitchers try to get a fastball by him. In today’s digital world, data is increasingly recorded in the form of curves rather than as a single point or average value. Unfortunately, we often find ourselves taking the easy way out by summarizing the curve into one value, such as the first or last point, or the value at a certain x. Examples include:
In this case study we will see that when each run of a DOE produces a curve, we can use non-linear fitting of the entire curve to extract more information from it. Often these are known theoretical models that match the behavior of the response. We will be able to extract parameters such as reaction rates, starting points, and asymptotes, and model those parameters to develop a prediction equation, allowing conditions to be found that will produce a curve with the desired shape, perhaps even one not exactly produced during the designed experiment. Stat-Ease 360® provides an easy link to Python, where the raw data can be passed, non-linear models fit, and key parameters fed back to Stat-Ease 360 ready to model. By the end of this presentation, users of this technique will no longer avoid the curve or have trouble with it. Instead, they will embrace the power of the curve and the extra information that comes from using all the data rather than a single summary statistic.
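As a generic illustration of the curve-fitting step (using SciPy directly rather than Stat-Ease 360's Python link; the kinetic model and the data below are made up):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical theoretical model: exponential rise to an asymptote.
def rise_to_asymptote(t, y0, y_max, k):
    """y0 = starting point, y_max = asymptote, k = reaction rate."""
    return y_max - (y_max - y0) * np.exp(-k * t)

# Simulated response curve for one run of the DOE (times in minutes).
t = np.linspace(0, 30, 16)
rng = np.random.default_rng(0)
y = rise_to_asymptote(t, 2.0, 10.0, 0.25) + rng.normal(0, 0.05, t.size)

# Fit the whole curve; the fitted parameters (not a single summary
# statistic) become the responses modeled across the design.
(p_y0, p_ymax, p_k), _ = curve_fit(rise_to_asymptote, t, y, p0=[1.0, 8.0, 0.1])
print(round(p_y0, 2), round(p_ymax, 2), round(p_k, 3))
```

Each DOE run yields its own (y0, y_max, k) triple, which is then modeled as a function of the factors.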
Join us on our quest for the Holy Grail. It’s not messianic, just a quick, simple and appropriate way to identify an adequate model for data from Definitive Screening Designs (DSDs)!
The release of Stat-Ease 360 last year was exciting for many reasons; not least the integration of Python scripting, which offers the Daring Software Developer limitless options to solve Dastardly Statistical Dilemmas…
In this talk, we will revisit DSDs, introduced by Jones and Nachtsheim (2011). Their special 3-level, fold-over, alias-optimal structure gives experimenters many desirable combinatorial properties at low cost in runs. However, DSDs are supersaturated designs, whose partially aliased second-order effects make their analysis more complicated; it often falls back on generic (automatic selection) regression methods. These and other, more conservative approaches do not take advantage of the useful structure DSDs possess.
We will demonstrate how SE360’s Python integration offers flexibility and extensibility, and has allowed us to implement a recommended design-oriented analysis method that fully exploits the unique structure of the DSD. We will see how Dynamic Software Development can rapidly augment SE360’s existing functionality, and allow us to build user-friendly tools that steer the experimenter towards Decent Statistical Decisions.
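To give a flavor of the structure being exploited: a DSD folds over a conference-matrix half-fraction, so differencing mirrored responses cancels all even (quadratic and two-factor interaction) effects, leaving a clean main-effects fit. A hypothetical sketch in Python, not the SE360 implementation, with made-up true effects:

```python
import numpy as np

# C is a 6x6 conference matrix (zero diagonal, +/-1 elsewhere, orthogonal
# columns); a 6-factor DSD consists of the runs of C, their mirrors -C,
# and a center run.
C = np.array([
    [ 0,  1,  1,  1,  1,  1],
    [ 1,  0,  1, -1, -1,  1],
    [ 1,  1,  0,  1, -1, -1],
    [ 1, -1,  1,  0,  1, -1],
    [ 1, -1, -1,  1,  0,  1],
    [ 1,  1, -1, -1,  1,  0],
], dtype=float)

rng = np.random.default_rng(7)
beta_main = np.array([3.0, 0, 0, -2.0, 0, 0])   # hypothetical main effects
beta_quad = np.array([0, 1.5, 0, 0, 0, 0])      # hypothetical quadratic effects

def y(runs):
    return runs @ beta_main + (runs ** 2) @ beta_quad + rng.normal(0, 0.1, len(runs))

y_plus, y_minus = y(C), y(-C)     # the mirrored pair of half-fractions

# The odd part contains only main effects; quadratics and interactions cancel.
y_odd = (y_plus - y_minus) / 2
b_main = np.linalg.lstsq(C, y_odd, rcond=None)[0]
print(np.round(b_main, 1))
```

The even part, (y_plus + y_minus)/2, is then searched separately for the active second-order terms, so main effects are estimated free of second-order aliasing.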
Mixture design of experiments (DOE) works wonderfully for optimizing formulations for food, paints, pharmaceuticals, chemicals, etc. Yet many researchers remain fixated on DOE tools developed for process factors (e.g., time and temperature). By doing so, they create suboptimal experiments and miss nuances provided by models specialized for mixtures. In this webinar, Martin will lay out many common mistakes made by formulators aiming to optimize their product recipes. Via several industrial case studies, he will explain how mixture DOE overcomes these pitfalls. Join him for a practical presentation (no theory!) packed with tips and tricks that you can immediately apply to catalyze your R&D.
The USDA Horticultural Research Laboratory in Fort Pierce, Florida, conducts research in breeding and genetics, entomology, plant pathology, post-harvest physiology, and fruit chemistry to solve agricultural problems in the U.S. This talk will describe five examples from laboratory, greenhouse, and field studies of in vitro breeding/genetics, horticulture, and entomology research at the USDA lab, along with the DOE approaches used. The designs include fractional factorial, response surface, mixture, and mixture-amount designs.
Two examples to improve the in vitro growth of citrus will be shown. In vitro technologies for horticulture, plant breeding, or genetic applications often require improving the growth or physiological response of a particular cell, tissue, or organ culture system before it can be used for crop improvement applications such as ploidy manipulation, cell selection, embryo rescue, genetic engineering, or mutant creation and isolation.
One greenhouse horticulture example to improve the bud grafting efficiency of greenhouse grown citrus will be shown. Because all citrus trees are grafted trees composed of a rootstock and a scion, budding efficiency is important for the citrus nursery industry that produces the grafted trees planted by citrus growers.
Two examples from entomology will be explained. The first example will illustrate the development of improved diet formulations used to grow the Diaprepes root weevil in our insectary, a facility for rearing insects. The insectary cultivates the insect pests used in our in-house research programs by the entomologists. The second example will illustrate the development of a synthetic pheromone mixture for both attraction to field traps and orientation disruption of the citrus leafminer, a major pest of citrus worldwide. Many insects use pheromones to communicate and recognize potential mates. Although some species use a single compound as a sex pheromone, most deploy a blend of compounds in species-specific ratios. The production of synthetic sex pheromones to disrupt mating in insect pest species is an important industry that contributes to reduced insecticide use in integrated crop protection strategies.
After decades of continuous development, Design-Expert® software (DX) leads the field in making design of experiments (DOE) easy for scientists and engineers. In response to many requests from loyal users to expand our statistical toolkit, we are proud to now produce Stat-Ease® 360 (SE360). With DX at its core, SE360 adds advanced features for statistics and engineering, as well as sophisticated application programming interface (API) tools. This webinar provides a briefing on the major innovations now available with SE360, and a bit of what's in store for the future.
With globalization, chemical industries are compelled to invest in research that minimizes time and costs while maximizing yield, productivity, and quality. The use of statistical techniques such as design of experiments (DOE), combined with chemical insight, allows us to deepen our knowledge of the processes under study and solve many problems.
In this work, I will illustrate some applications of the Design-Expert software. I used factorial designs in nitrations of common aromatic compounds such as benzene, bromobenzene, chlorobenzene, and phenol. The variables studied were temperature, time, the molar ratio of HNO3 to aromatic compound, and the molar ratio of H2SO4 to HNO3. The responses evaluated were conversion, the yield of each isomer produced, and the para/ortho isomer selectivity.

Although the products studied involve apparently well-consolidated processes, in every case I can cite some improvement over the results available in the literature, whether in conversion, yield, or the selectivity of a particular isomer.

I highlight the chlorobenzene mononitration process for selectively obtaining p-nitrochlorobenzene, an important intermediate for a series of dyes. This process also produces the o-nitrochlorobenzene isomer, whose properties are very similar to those of the product of interest, forcing a sequence of costly purifications. In the literature, the molar ratio between the isomers was 2:1 (para:ortho). Applying the software together with chemical knowledge of the subject, and using the same reagents, I obtained an increase in selectivity to 15:1 (para:ortho). This made it possible to eliminate a series of purification treatments required to meet the commercialization specifications of the product of interest. It was also possible to identify another experimental condition favoring the ortho isomer, achieving 1:9 (para:ortho). All with excellent conversions.
The Decontamination Sciences Branch has been integrating DOE methods into many of its chemical agent decontamination research programs. Beginning in 2014, Mr. Davies led efforts to integrate Mixture-Process (Formulation) DOE into the Branch’s programs. Over the past five years, Mr. Davies has worked with researchers to develop Mixture-Process DOE techniques that simultaneously model the influences of formulation components and process factors, which has reduced experimental sample sizes by 70% to 95% and transformed the way decontamination formulations are optimized.

Formulation optimizations that used to require several months of lab testing with intuition-based or one-factor-at-a-time experimentation methods are now completed using DOE in as little as 1 to 5 days. Added benefits from DOE have been:
This presentation will highlight a recent DoD research program in which a Mixture-Process DOE created with Design-Expert software was used to formulate the point-of-use “Sprayable Decontaminant Slurry” (SDS). The SDS program was tasked with finding the optimal blend of the five SDS formulation components that would provide the best overall performance against 6 different chemical warfare agents, on 4 different materials (the equivalent of 24 categorical levels), and over 4 continuous processing factors. Remarkably, a Mixture-Process DOE using a KCV (Kowalski/Cornell/Vining) model was able to successfully model this vast design space and deliver the final formulation after just three laboratory days of testing.
This presentation will also address the “soft side” issues encountered while transitioning an organization from conventional experimentation to DOE. Soft-side issues are non-technical in nature and are associated with cultural values and long-held research traditions. Examples of what George Box described as statistics being a “catalyst for scientific discovery” will be provided to illustrate how DOE allows many hypotheses to be tested simultaneously, which complements, and in no way supplants, the scientific method.
Until I get a response to my Craigslist ad, our Air Force aircraft will still need to traverse the sky searching for targets. Since sensor performance is frequently a function of sensor-target geometry, our statistical models must account for the relevant range, azimuth, elevation, etc. Aircraft position is certainly a hard-to-change factor (without our transporter), and it is further complicated by being required to follow a sequence. That is, long ranges are followed by progressively shorter ranges as we fly inbound toward the target, and vice versa as we recede from it. Furthermore, we occupy each range position from start to finish on each leg. The physical result is a physics-imposed full factorial design with several sequential ranges at each altitude. The resulting geometric design looks like this in Cartesian coordinates, with the Y axis as altitude and the X axis as ground range. One can also represent this design in polar coordinates, as shown by the slant ranges and depression angles.
In addition to these geometric variables we have several other variables:
These variables, which we shall call “modal” variables, are largely easy to change. A full design prescribes values for both the geometric and the modal variables. The geometric variables are fixed in sequence by physics and can be considered whole-plot (hard-to-change) variables. What makes this design a challenge is that the whole-plot values are already determined and sequenced. How shall we mate the large number of combinations of the modal variables to the fixed geometric values so that the resulting design has excellent modeling properties: orthogonality, power, low variance inflation, good prediction variance, and so on? Simply building a fractional split-plot design is one possibility, but such a design invariably leaves “holes” where the aircraft occupies positions at which no points are taken, an obvious inefficiency.
This talk describes the algorithm we derived, which takes advantage of several of Design-Expert’s features, including building and importing custom designs, and uses DX’s excellent evaluation features to ensure that the modeling properties are acceptable. Because the resulting design stitches the whole-plot geometric design to an (often optimal) blocked design for the modal variables, it can be viewed as a “Franken-design,” a term of art we invented to describe designs built from individually crafted pieces merged together. The author introduced Franken-designs last year at the 2021 Stat-Ease Online DOE Summit!
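The modeling properties mentioned above, such as variance inflation, reduce to simple linear algebra on the model matrix. A toy sketch of the VIF computation (Design-Expert's evaluation reports these directly; the design here is an illustrative full factorial, not the flight-test design):

```python
import numpy as np

# Toy 2^3 full factorial model matrix (main effects only, coded -1/+1).
X = np.array([
    [-1, -1, -1],
    [ 1, -1, -1],
    [-1,  1, -1],
    [ 1,  1, -1],
    [-1, -1,  1],
    [ 1, -1,  1],
    [-1,  1,  1],
    [ 1,  1,  1],
], dtype=float)

def vifs(X):
    """VIF of column j = 1 / (1 - R^2 of column j regressed on the rest)."""
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(X)), others])  # intercept + rest
        coef = np.linalg.lstsq(A, X[:, j], rcond=None)[0]
        resid = X[:, j] - A @ coef
        r2 = 1 - (resid @ resid) / ((X[:, j] - X[:, j].mean()) ** 2).sum()
        out.append(1.0 / (1.0 - r2))
    return out

print([round(v, 2) for v in vifs(X)])   # an orthogonal design gives all VIFs = 1.0
```

VIFs near 1 indicate near-orthogonality; the same routine applied to a candidate Franken-design flags any terms whose estimates are inflated by correlation among columns.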