Hey, everyone. In this video we're going to be discussing robust design processes, focusing on the analysis once we've got all our data. Because once you've got your data, what do you do with it? We're going to check that out. When conducting an analysis, you can compute the mean value of the objective function for each factor setting and identify which control factors reduce the effect of noise and which ones can be used to scale the response. With the Analysis of Means, you plot the average effect of each factor level. When you start to look at the data and how things are distributed, if you see a lot of outliers and no real trends, chances are you're not accounting for all the variables. You might be looking at production or manufacturing at a certain level and thinking, "I know that under these conditions it works this way," but it's really not predictable. Then you ask, what other factors apply? Maybe the manufacturing is taking place at one o'clock in the morning instead of bright and early at, say, nine o'clock, and the team working that shift is really tired. There might be fewer people available, or only people with a certain skill set available at that time of day. There are lots of human factors that can influence the outcomes that you might not typically take into account when assessing performance or predictability. This is a lot of fun to me, because there's always some human element. The human element cracks me up, by the way; it's hilarious. Step 6: selecting your setpoints. Choose settings to maximize or minimize the objective function. Consider variations carefully. For advanced use, conduct confirming experiments.
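To make the Analysis of Means step concrete, here's a minimal Python sketch: compute the mean response at each level of each factor and pick the level that maximizes the objective. The factor names, levels, and response numbers are made-up assumptions, not data from the lecture.

```python
# Sketch of an Analysis of Means (ANOM) on a small, hypothetical DOE dataset.
from statistics import mean

# Each trial records its factor settings plus the measured objective function.
trials = [
    {"temp": "low",  "speed": "slow", "response": 42.0},
    {"temp": "low",  "speed": "fast", "response": 45.5},
    {"temp": "high", "speed": "slow", "response": 50.1},
    {"temp": "high", "speed": "fast", "response": 48.9},
]

def level_means(trials, factor):
    """Average response at each level of one factor."""
    by_level = {}
    for t in trials:
        by_level.setdefault(t[factor], []).append(t["response"])
    return {lvl: mean(vals) for lvl, vals in by_level.items()}

for factor in ("temp", "speed"):
    means = level_means(trials, factor)
    best = max(means, key=means.get)  # maximizing the objective here
    print(factor, means, "-> best level:", best)
```

In a real study you'd plot these level means as an ANOM chart; this just produces the numbers behind that plot.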
If you've got something that requires working within a very specific framework, this is where you really need to go back and confirm all your data. An example could be if you're developing a medication and you want it to work for a certain percentage of people, or you want to make sure the mortality rate isn't high at all. These things are pretty critical. Or, based on what the expected outcome of a certain medication is, you'd say, "Well, it'll cause a two-degree increase in body temperature," and then you look at how you can adjust for those variables and predict the outcome. Looking at it from a hardware or physical-product perspective, just because something falls outside of the range you need for the use you originally thought of doesn't mean it won't meet the needs of another one. An example in this regard: let's say you make motorcycle gloves, but you know the gloves you've developed aren't going to be as abrasion resistant as you'd like for a motorcycle. They're not going to have the same knuckle protection, and people's hands are going to get busted, so under those conditions you're like, "Well, this is a terrible motorcycle glove." But then you look at its resistance to heat, its size, and these other variables, and maybe it's much better suited to be a welding glove. Then you're like, "Okay, I can go back to making a motorcycle glove later, but for now I'm going to sell this to a whole bunch of welders, because it outperforms the current products on the market." When you're developing your product, your service, or your offerings, always keep in mind what other potential uses there could be for your concept. This is where I see a lot of people give up too early. They're like, "My target market doesn't like it." Well, there are plenty more fish in the sea. If that person won't appreciate you, somebody else will.
Good to know. Confounding interactions: generally the main effects dominate the response, but sometimes interactions are important. This is generally the case when the confirming trial fails. Again, I like to go back and look at what other variables could be at play, why things are failing, and what other interactions could be competing. Where might some specifications not line up, and where could they be in total conflict with each other? But also, what specific needs does your target market have, and where can you make concessions? And then, how can you repackage, re-brand, and sell it off as another product to hit another market? Because ultimately, the way I look at it, if you mess up in one area, it doesn't mean you should stop. It just means you find a new market. Same thing goes for relationships. I'm just playing, just trying to keep you entertained, guys and girls. Alternative experiment design approach: adaptive factor one at a time. Start at nominal levels, then test each level of each factor one at a time, while freezing the previous factors at the best level found so far. Key concept of robust design: variation causes quality loss. This is why, when you start to run a certain number of predictions and trials, you should make sure you've got a solid data set. If you start running way too many variations of way too many things, you might end up with a lot of data that you don't really know what to do with. When you're doing your design of experiments and trying to get your head around the data, a data modeling or data visualization tool can be really useful. So can having a good data scientist on your team, someone who can use Python to sort the data and rank normalized raw data, if you need to tie in more of the noise factors that can be involved.
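The adaptive one-factor-at-a-time approach described above can be sketched in a few lines: start every factor at nominal, sweep one factor's levels while the others stay frozen at the best settings found so far, and keep whatever improves the objective. The objective function and levels below are stand-in assumptions for illustration.

```python
# Sketch of adaptive one-factor-at-a-time (adaptive OFAT) experimentation.

def objective(settings):
    # Hypothetical response surface; in practice this is a physical trial.
    score = {"low": 1, "nominal": 2, "high": 3}
    return score[settings["temp"]] * 2 + score[settings["speed"]]

factors = {
    "temp": ["low", "nominal", "high"],
    "speed": ["low", "nominal", "high"],
}

best = {name: "nominal" for name in factors}  # start at nominal levels
for name, levels in factors.items():
    for level in levels:
        trial = dict(best, **{name: level})  # vary one factor only
        if objective(trial) > objective(best):
            best = trial  # freeze this factor at its best level so far

print(best)
```

Note the trade-off the lecture hints at: OFAT needs few trials, but because each factor is swept with the others frozen, it can miss strong interactions between factors.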
An example could be if you're developing a certain tool that would be used outside: you could tie in past data from, say, NOAA, and look at which regions could be a key market. It'll operate really well in, say, Alaska, but it might not operate as well in Florida. Temperatures make a big difference in the performance of physical products, and they determine what adaptations you need to make so it can perform well in both Florida and Alaska. Again, if it doesn't work well in Alaska but works well in Florida, then you hit Florida in your strategy. Make adaptations, re-brand and re-package it, and take it to Alaska; change the languages, services, and so on, and you can hit a whole different market. These variations don't mean failure from all standpoints. It just means it may not work for your specific market. Stay optimistic; failing in one market doesn't mean failing in all of them. DOE plan and data: you can build these spreadsheets out, and if you're coming up with something technical and you want to sell it to the military, or to an OEM vendor for a technology company that requires certain specifications for integration with their tools, then having spreadsheets to show you can meet those metrics is important. Military specifications, mil-specs, have very rigid guidelines for performance, and being able to operate within those guidelines will open up a whole new potential market. Same thing if you want to expand into another country: let's say you've got a vehicle, and certain emissions levels might make it either possible or impossible for you to enter. Same thing with genetically modified food. It's just so interesting to me. Factor effects charts: again, this is where you start to look at the different factors, noise variables, and all of this. Again, unless you're developing something highly technical that needs to be integrated with a certain technology or tool, with everything measured out.
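A factor effects chart boils down to a simple calculation: for each factor, take the spread (max minus min) of its level means; the bigger the spread, the bigger the effect, and the chart just plots those spreads in ranked order. A minimal sketch, with made-up factor names and data:

```python
# Sketch of the numbers behind a factor effects chart: rank factors by the
# range of their level means. Data values are hypothetical.
from statistics import mean

trials = [
    {"material": "A", "coating": "X", "response": 10.0},
    {"material": "A", "coating": "Y", "response": 12.0},
    {"material": "B", "coating": "X", "response": 18.0},
    {"material": "B", "coating": "Y", "response": 20.0},
]

def effect_size(factor):
    """Range of the mean response across this factor's levels."""
    by_level = {}
    for t in trials:
        by_level.setdefault(t[factor], []).append(t["response"])
    means = [mean(vals) for vals in by_level.values()]
    return max(means) - min(means)

ranked = sorted(("material", "coating"), key=effect_size, reverse=True)
print(ranked)  # factors ordered from largest to smallest effect
```

Feeding these ranked effect sizes to any charting or spreadsheet tool gives you the factor effects chart itself.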
You may not need to go this far, but it's good to know. Conduct your analysis. You're going to have all your data available, and you might be wondering, "Now what do I do with all this?" If you aren't too sure and you've just got lots of data, there's nothing wrong with reaching out to, say, a university and seeing if there are students with data modeling experience who may want an internship. I've hired a bunch of interns in different fields to help. I give them real job experience, I pay well, and then you're able to hit certain metrics and find strengths you may not otherwise have found. The other great thing about finding younger talent that way is that a lot of the technologies coming out now weren't accessible to previous generations; there are some great tools for experimentation, data visualization, and data analysis, and data science is a hugely growing field, one that's going to be critical to just about every business concept. Thanks.