
Understanding Monte Carlo Simulation with an Example


To someone who has collected and analyzed real data for a living, the idea of using simulated data for a Monte Carlo simulation can sound a bit odd. How can you improve a real product with simulated data? In this post, I’ll help you understand the methods behind Monte Carlo simulation and walk you through a simulation example using Companion by Minitab.

Process capability chart

Companion by Minitab is a software platform that combines a desktop app for executing quality projects with a web dashboard that makes reporting on your entire quality initiative effortless. Among the first-in-class tools in the desktop app is a Monte Carlo simulation tool that makes this method extremely accessible.

What Is Monte Carlo Simulation?

The Monte Carlo method uses repeated random sampling to generate simulated data to use with a mathematical model. This model often comes from a statistical analysis, such as a designed experiment or a regression analysis.

Suppose you study a process and use statistics to model it like this:

Regression equation for the process

With this type of linear model, you can enter the process input values into the equation and predict the process output. In the real world, however, each input won't hold a single fixed value, thanks to variability. Unfortunately, this input variability causes variability and defects in the output.

To design a better process, you could collect a mountain of data in order to determine how input variability relates to output variability under a variety of conditions. However, if you understand the typical distribution of the input values and you have an equation that models the process, you can easily generate a vast amount of simulated input values and enter them into the process equation to produce a simulated distribution of the process outputs.
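If you prefer to see the mechanics in code, here is a minimal sketch of the idea in Python with NumPy. The model coefficients, input distributions, and spec limit are invented for illustration; they stand in for whatever your own regression analysis and process knowledge would supply.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 50_000  # Companion simulates 50,000 trials by default

# Hypothetical fitted model: Output = 12 + 3.5*A - 2.1*B,
# with inputs A and B following distributions estimated from process data
A = rng.normal(loc=10.0, scale=0.5, size=n)
B = rng.normal(loc=4.0, scale=0.3, size=n)

# Push every simulated input pair through the transfer function
output = 12 + 3.5 * A - 2.1 * B

# Summarize the simulated output against a hypothetical upper spec limit
usl = 42.0
print(f"mean = {output.mean():.2f}, sd = {output.std(ddof=1):.2f}")
print(f"% above USL = {100 * (output > usl).mean():.2f}%")
```

Changing a mean or standard deviation in the two rng.normal calls and rerunning is exactly the kind of "what if" question the simulation answers.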

You can also easily change these input distributions to answer "what if" types of questions. That's what Monte Carlo simulation is all about. In the example we are about to work through, we'll change both the mean and standard deviation of the simulated data to improve the quality of a product.

Today, simulated data is routinely used in situations where resources are limited or gathering real data would be too expensive or impractical.

How Can Monte Carlo Simulation Help You?

With Companion by Minitab, engineers can easily perform a Monte Carlo analysis in order to:

  • Simulate product results while accounting for the variability in the inputs
  • Optimize process settings
  • Identify critical-to-quality factors
  • Find a solution to reduce defects

Along the way, Companion interprets simulation results and provides step-by-step guidance to help you find the best possible solution for reducing defects. I'll show you how to accomplish all of this right now!

Step-by-Step Example of Monte Carlo Simulation

A materials engineer for a building products manufacturer is developing a new insulation product. The engineer performed an experiment and used statistics to analyze process factors that could impact the insulating effectiveness of the product. (The data for this DOE is just one of the many data set examples that can be found in Minitab’s Data Set Library.) For this Monte Carlo simulation example, we’ll use the regression equation shown above, which describes the statistically significant factors involved in the process.

Let's open Companion by Minitab's desktop app (if you don't already have it, you can try Companion free for 30 days). Open or start a new project, then right-click on the project Roadmap™ to insert the Monte Carlo Simulation tool.

insert monte carlo simulation

Step 1: Define the Process Inputs and Outputs

The first thing we need to do is to define the inputs and the distribution of their values. The process inputs are listed in the regression output and the engineer is familiar with the typical mean and standard deviation of each variable. For the output, we simply copy and paste the regression equation that describes the process from Minitab statistical software right into Companion's Monte Carlo tool!

When the Monte Carlo tool opens, we are presented with these entry fields:

Setup the process inputs and outputs

It's an easy matter to enter the information about the inputs and outputs for the process as shown.

Setup the input values and the output equation

Verify your model with the above diagram and then click Simulate in the application ribbon.

perform the monte carlo simulation

Initial Simulation Results

After you click Simulate, Companion very quickly runs 50,000 simulations by default, though you can specify a higher or lower number of simulations. 

Initial simulation results

Companion interprets the results for you using output that is typical for capability analysis—a capability histogram, percentage of defects, and the Ppk statistic. Companion correctly points out that our Ppk is below the generally accepted minimum value of 1.33.

Step-by-Step Guidance for the Monte Carlo Simulation

But Companion doesn’t just run the simulation and then let you figure out what to do next. Instead, Companion has determined that our process is not satisfactory and presents you with a smart sequence of steps to improve the process capability.

How is it smart? Companion knows that it is generally easier to control the mean than the variability. Therefore, the next step that Companion presents is Parameter Optimization, which finds the mean settings that minimize the number of defects while still accounting for input variability.

Next steps leading to parameter optimization

Step 2: Define the Objective and Search Range for Parameter Optimization

At this stage, we want Companion to find an optimal combination of mean input settings to minimize defects. After you click Parameter Optimization, you'll need to specify your goal and use your process knowledge to define a reasonable search range for the input variables.

Setup for parameter optimization

And, here are the simulation results!

Results of the parameter optimization

At a glance, we can tell that the percentage of defects is way down. We can also see the optimal input settings in the table. However, our Ppk statistic is still below the generally accepted minimum value. Fortunately, Companion has a recommended next step to further improve the capability of our process.

Next steps leading to a sensitivity analysis

Step 3: Control the Variability to Perform a Sensitivity Analysis

So far, we've improved the process by optimizing the mean input settings. That reduced defects greatly, but we still have more to do in the Monte Carlo simulation. Now, we need to reduce the variability in the process inputs in order to further reduce defects.

Reducing variability is typically more difficult. Consequently, you don't want to waste resources controlling the standard deviation for inputs that won't reduce the number of defects. Fortunately, Companion includes an innovative graph that helps you identify the inputs where controlling the variability will produce the largest reductions in defects.

Setup for the sensitivity analysis

In this graph, look for inputs with sloped lines because reducing these standard deviations can reduce the variability in the output. Conversely, you can ease tolerances for inputs with a flat line because they don't affect the variability in the output.

In our graph, the slopes are fairly equal. Consequently, we'll try reducing the standard deviations of several inputs. You'll need to use process knowledge in order to identify realistic reductions. To change a setting, you can either click the points on the lines, or use the pull-down menu in the table.
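In code, a sensitivity analysis boils down to sweeping each input's standard deviation and watching the defect rate respond. The transfer function, distributions, and spec limit below are hypothetical stand-ins (the insulation regression itself isn't reproduced here), but the sweep pattern is the general technique:

```python
import numpy as np

rng = np.random.default_rng(seed=4)
n = 20_000
usl = 25.0  # hypothetical upper spec limit on the output

def pct_defects(sd_a, sd_b):
    # Stand-in model; substitute your own fitted equation and distributions
    a = rng.normal(10.0, sd_a, n)
    b = rng.normal(5.0, sd_b, n)
    y = 1.2 + 1.8 * a + 0.9 * b
    return 100 * (y > usl).mean()

# Shrink each input's standard deviation in turn, holding the other fixed.
# The input whose defect curve falls fastest is the one worth controlling.
for frac in (1.0, 0.75, 0.5):
    print(f"A sd x{frac}: {pct_defects(0.8 * frac, 0.9):5.2f}%   "
          f"B sd x{frac}: {pct_defects(0.8, 0.9 * frac):5.2f}%")
```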

Final Monte Carlo Simulation Results

Results of the sensitivity analysis

Success! We've reduced the number of defects in our process and our Ppk statistic is 1.34, which is above the benchmark value. The assumptions table shows us the new settings and standard deviations for the process inputs that we should try. If we ran Parameter Optimization again, it would center the process and I'm sure we'd have even fewer defects.

To improve our process, Companion guided us on a smart sequence of steps during our Monte Carlo simulation:

  1. Simulate the original process
  2. Optimize the mean settings
  3. Strategically reduce the variability

If you want to try Monte Carlo simulation for yourself, get the free trial of Companion by Minitab!


Making the World a Little Brighter with Monte Carlo Simulation


If you have a process that isn’t meeting specifications, using the Monte Carlo simulation and optimization tool in Companion by Minitab can help. Here’s how you, as a chemical technician for a paper products company, could use Companion to optimize a chemical process and ensure it consistently delivers a paper product that meets brightness standards.

The brightness of Perfect Papyrus Company’s new copier paper needs to be at least 84 on the TAPPI brightness scale. The important process inputs are the bleach concentration of the solution used to treat the pulp and the processing temperature. The relationship is explained by this equation:

Brightness = 70.37 + 44.4 Bleach + 0.04767 Temp – 64.3 Bleach*Bleach

Bleach concentration follows a normal distribution with a mean of 0.25 and a standard deviation of 0.0095 percent. Temperature also follows a normal distribution, with a mean of 145 and a standard deviation of 15.3 degrees C.

Building your process model

To assess the process capability, you can enter the parameter information, transfer function, and specification limit into Companion's straightforward interface, and instantly run 50,000 simulations.

paper brightness monte carlo simulation

Understanding your results

monte carlo simulation output

The capability statistic (Cpk) is 0.162, far short of the minimum standard of 1.33. Companion also indicates that under current conditions, you can expect the paper’s brightness to fall below the standard about 31.5% of the time.
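You can verify these numbers independently of Companion with a few lines of Python, using the transfer function and input distributions given above. Within simulation error, this sketch reproduces both figures:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 50_000  # matches Companion's default number of simulations

# Input distributions as given above
bleach = rng.normal(0.25, 0.0095, n)  # bleach concentration, percent
temp = rng.normal(145, 15.3, n)       # processing temperature, degrees C

# Transfer function from the post
brightness = 70.37 + 44.4 * bleach + 0.04767 * temp - 64.3 * bleach**2

lsl = 84.0  # lower spec limit: TAPPI brightness of at least 84
mean, sd = brightness.mean(), brightness.std(ddof=1)
print(f"% below spec = {100 * (brightness < lsl).mean():.1f}%")  # about 31.5%
print(f"Cpk = {(mean - lsl) / (3 * sd):.2f}")  # one-sided index; about 0.16
```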

Finding optimal input settings

Companion's smart workflow guides you to the next step for improving your process: optimizing your inputs.

parameter optimization

You set the goal—in this case, maximizing the brightness of the paper—and enter the high and low values for your inputs.

optimization dialog

Simulating the new process

After finding the optimal input settings in the ranges you specified, Companion presents the simulated results for the recommended process changes.

optimized process output

The results indicate that if the bleach concentration is set to approximately 0.3 percent and the temperature to 160 degrees, the percentage outside of specification drops to about 2%, with a Cpk of 0.687. Much better, but not good enough.
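The post doesn't describe the search algorithm Companion uses, but a brute-force grid search over candidate mean settings conveys the idea behind Parameter Optimization. The search ranges below are assumptions standing in for the process knowledge you would enter:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 20_000
lsl = 84.0

def pct_defective(bleach_mean, temp_mean):
    """Simulate the process at candidate mean settings; the sds stay fixed."""
    bleach = rng.normal(bleach_mean, 0.0095, n)
    temp = rng.normal(temp_mean, 15.3, n)
    brightness = 70.37 + 44.4 * bleach + 0.04767 * temp - 64.3 * bleach**2
    return (brightness < lsl).mean()

# Assumed search ranges: bleach 0.20-0.30 percent, temperature 130-160 degrees
candidates = [(b, t) for b in np.linspace(0.20, 0.30, 11)
                     for t in np.linspace(130, 160, 7)]
best = min(candidates, key=lambda bt: pct_defective(*bt))
print(f"best settings: bleach ~ {best[0]:.2f}, temp ~ {best[1]:.0f}")
print(f"% defective there ~ {100 * pct_defective(*best):.1f}%")
```

With these assumed ranges the search lands near bleach 0.3 and temperature 160, in line with the settings Companion recommends.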

Understanding variability

To further improve the paper brightness, Companion’s smart workflow suggests that you next perform a sensitivity analysis.

sensitivity analysis

Companion’s unique graphic presentation of the sensitivity analysis gives you more insight into how the variation of your inputs influences the percentage of your output that doesn’t meet specifications.

sensitivity analysis of paper brightness

The blue line representing temperature indicates that variation in this factor has a greater impact on your process than variation in bleach concentration, so you run another simulation to visualize the brightness using a 50% reduction in temperature variation.
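To check that effect yourself, you can rerun the same simulation with the temperature standard deviation cut from 15.3 to 7.65 degrees. The optimized mean settings from the previous step (bleach 0.3, temperature 160) are assumed here:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 50_000
bleach = rng.normal(0.30, 0.0095, n)  # optimized mean settings
temp = rng.normal(160, 7.65, n)       # temperature sd halved: 15.3 -> 7.65
brightness = 70.37 + 44.4 * bleach + 0.04767 * temp - 64.3 * bleach**2
lsl = 84.0
print(f"% below spec = {100 * (brightness < lsl).mean():.3f}%")  # near zero
print(f"Cpk = {(brightness.mean() - lsl) / (3 * brightness.std(ddof=1)):.2f}")
# roughly 1.3-1.4, in line with the result reported below
```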

final paper brightness model simulation

The simulation shows that reducing the variability will result in 0.000 percent of the paper falling out of spec, with a Cpk of 1.34. Thanks to you, the outlook for the Perfect Papyrus Company’s new copier paper is looking very bright.

Getting great results

Figuring out how to improve a process is easier when you have the right tool to do it. With Monte Carlo simulation to assess process capability, Parameter Optimization to identify optimal settings, and Sensitivity Analysis to pinpoint exactly where to reduce variation, Companion can help you get there.

To try the Monte Carlo simulation tool, as well as Companion's more than 100 other tools for executing and reporting on quality projects, learn more and get the free 30-day trial version for you and your team at companionbyminitab.com.

Using a Value Stream Map to Find and Slay the Dragons of Process Waste


In ancient times dragons were believed to be set by the gods to guard golden treasures. This is because dragons were the most fearsome creatures and would deter would-be thieves. Dragons typically lived in an underground lair or castle and would sleep on top of their gold and treasures. They were terrifying and often depicted as large fire-breathing, scaly creatures with wings and a huge deadly spiked tail. One blow from its tail or fire-breath meant doom for any hopeful knight trying to slay this evil beast!

Just as dragons guarded their treasure, so do process steps guard their waste and excess inventory. Like dragons, these steps lay hidden, deep in the process, and fiercely defend their territory. They defy change and are experts at diverting attention to other parts of the process. They go by names such as Over-production, Over-processing, Waiting, Rework Loops, Defects, and Excess Inventory. There are costs associated with these steps too: acquiring and storing excess raw materials, warehousing partially or fully finished inventory, spare equipment, and maintaining that equipment, to name just a few.

How do you find and then slay these process dragons? You need a knight in shining armor to come to your rescue and slay the crafty dragons! A process improvement practitioner has the right tools and techniques, and—with the help of a knowledgeable team—can generate a map that reveals just where the dragons lurk. Typically, quality professionals have been trained in the traditional DMAIC problem-solving methodology and have a trusty sidekick, such as Companion by Minitab®, to help.

The Value Stream Map (VSM) will be one of the most useful tools for finding hidden process waste. The VSM illustrates the flow of materials and information as a product or service moves through the value stream. A value stream is the collection of all activities, both value-added and non-value-added, that generate a product or service required to meet customer needs.
http://support.minitab.com/en-us/companion/vsm_complete.png

A current-state value stream map identifies waste and helps you to envision an improved future state. Companion by Minitab® has an easy-to-use VSM tool and other tools that make the process improvement journey fun. As you work through the process of mapping the steps, calculating takt times and value-add ratios, use the following three tips to uncover opportunities for improvements.

  1. By default, process shapes and inventory shapes display data on the map after you enter values for Cycle Time, VA CT, NVA CT, Changeover, Inventory, and Inv Time. To display other data, use the Map > Data Display > Select and Arrange Shape Data dialog box and drag a data field from the list to the shape. Release the mouse button when the field is where you want it; the red line indicates where the data will be displayed beside the shape on the map.
    http://support.minitab.com/en-us/companion/vsm_select_arrange_data_dialog.png
  2. If you prefer to hide some or all of the data, you can select a shape and then choose Map > Data Display > Shape Data Labels. In this example, only the data labels are hidden.
    Shape Labels

    To hide all the shape data, choose Map > Data Display > Shape Data. In this example, the data labels, Cycle Time, VA CT, and Operators, and their values are hidden.
    Shape Data
     
  3. Use comments fields to take notes. Simply click the step and use the Comments field in the task pane on the Other tab. The comment symbol, which is circled in the image, appears above the shape.

Comments

In summary, process waste dragons are hard to find and harder to slay unless you have the appropriate problem-solving tools and techniques. Understanding the size of the pile of gold and how much of it you can win back from the dragon is key to engaging management and employee support. Together you can be successful at slaying those dragons. Keep in mind, however, that dragons never really die – they always come back in the sequel!

To get your free 30-day trial of the Companion by Minitab® software, please go to the www.Minitab.com/Companion website. 
 

Many thanks to Dean Williams of Duke Energy for allowing me to use his ideas from the Slaying the Inventory Dragon presentation at the 2017 Lean and Six Sigma World Conference.

For Want of an FMEA, the Empire Fell


Don't worry about it, we'll be fine without an FMEA!

by Matthew Barsalou, guest blogger

For want of a nail the shoe was lost,
For want of a shoe the horse was lost,
For want of a horse the rider was lost
For want of a rider the battle was lost
For want of a battle the kingdom was lost
And all for the want of a horseshoe nail. (Lowe, 1980, 50)

According to the old nursery rhyme, "For Want of a Nail," an entire kingdom was lost because of the lack of one nail for a horseshoe. The same could be said for the Galactic Empire in Star Wars. The Empire would not have fallen if the technicians who created the first Death Star had done a proper Failure Mode and Effects Analysis (FMEA).

A group of rebels in Star Wars, Episode IV: A New Hope stole the plans to the Death Star and found a critical weakness that led to the destruction of the entire station. A simple thermal exhaust port was connected to a reactor in a way that permitted an explosion in the exhaust port to start a chain reaction that blew up the entire station. This weakness was known, but it was considered insignificant because it could only be exploited by small space fighters, and the exhaust port was protected by turbolasers and TIE fighters. It was thought that nothing could penetrate the defenses; however, a group of Rebel X-Wing fighters proved that this weakness could be exploited. One proton torpedo fired into the thermal exhaust port started a chain reaction that reached the station's reactors and destroyed the entire battle station (Lucas, 1976).

Why the Death Star Needed an FMEA

The Death Star was designed by the engineer Bevil Lemelisk under the command of Grand Moff Wilhuff Tarkin, whose doctrine called for a heavily armed mobile battle station carrying more than 1,000,000 imperial personnel as well as over 7,000 TIE fighters and 11,000 land vehicles (Smith, 1991). It was constructed in orbit around the penal planet Despayre in the Horuz system of the Outer Rim Territories and was intended to be a key element of the Tarkin Doctrine for controlling the Empire. The current estimate for the cost of building a Death Star is $850,000,000,000,000,000 (Rayfield, 2013).

Such an expensive, resource-consuming project should never be attempted without a design FMEA. The loss of the Death Star could have been prevented with just one properly filled-out FMEA during the design phase:

FMEA Example

The Galactic Empire's engineers frequently built redundancy into the systems on the Empire’s capital ships and space stations; unfortunately, the Death Star's systems were all connected to the main reactor to ensure that power would always be available for each individual system. This interconnectedness resulted in thermal exhaust ports that were directly connected to the main reactor.

The designers knew that an explosion in a thermal exhaust port could reach the main reactor and destroy the entire station, but they were overconfident and believed that limited prevention measures--such as turbolaser towers, shielding that could not prevent the penetration of small space fighters, and wings of TIE fighters--could protect the thermal exhaust ports (Smith, 1991). Such thinking is little different than discovering a design flaw that could lead to injury or death, but deciding to depend upon inspection to prevent anything bad from happening. Bevil Lemelisk could not have ignored this design flaw if he had created an FMEA.

Assigning Risk Priority Numbers to an FMEA

An FMEA can be done with a pencil and paper, although Minitab's Companion software for executing and reporting on process improvement has a built-in FMEA form that automates calculations, and shares data with process maps and other forms you'll probably need for your project. 

An FMEA uses a Risk Priority Number (RPN) to determine when corrective actions must be taken. RPN values range from 1 to 1,000, and lower numbers are better. The RPN is determined by multiplying severity (S) by occurrence (O) by detection (D).

RPN = S x O x D

Severity, occurrence and detection are each evaluated and assigned a number between 1 and 10, with lower numbers being better.
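The calculation is trivial, but a small helper makes the convention explicit. Here is a minimal sketch in Python, using the Death Star ratings assigned later in this post (the range check is just a convenience, not part of any FMEA standard):

```python
def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk Priority Number: each factor is rated 1-10, lower is better."""
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("each rating must be between 1 and 10")
    return severity * occurrence * detection

# Death Star thermal exhaust port: severity 10, occurrence 3, detection 4
print(rpn(10, 3, 4))  # 120 -- high enough to demand corrective action
```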

Failure Mode and Effects Analysis Example: Death Star Thermal Exhaust Ports

In the case of the Death Star's thermal exhaust ports, the failure mode would be an explosion in the exhaust port and the resulting effect would be a chain reaction that reaches the reactors. The severity would be rated as 10 because an explosion of the reactors would lead to the loss of the station as well as the loss of all the personnel on board. A 10 for severity is sufficient reason to look into a redesign so that a failure, no matter how improbable, does not result in injury or loss of life.

FMEA Failure Mode Severity Example

The potential cause of failure on the Death Star would be attack or sabotage; the designers did not consider this likely to happen, so occurrence is a 3. The main control measure was shielding that would only be effective against attack by large ships. This was rated as a 4 because the Empire believed these measures to be effective.

Potential Causes and Current Controls

The resulting RPN would be S x O x D = 10 x 3 x 4 = 120. An RPN of 120 should be sufficient reason to take action, but even a lower RPN would require corrective action because of the high severity rating. The Death Star's RPN may even be too low, given the Empire's overconfidence in the current controls. Corrective actions are definitely needed.

FMEA Risk Priority Number

Corrective actions are easier and cheaper to implement early in the design phase, particularly if the problem is detected before assembly starts. The original Death Star plans could have been modified with little effort before construction started. The shielding could have been improved to prevent any penetration and, more importantly, the interlinks between the systems could have been removed so that a failure of one system, such as an explosion in the thermal exhaust port, does not destroy the entire Death Star. The RPN needs to be reevaluated after corrective actions are implemented and verified; the new Death Star RPN would be 5 x 3 x 2 = 30.

FMEA Revised Metrics

Of course, doing the FMEA would have had more important impacts than just achieving a low number on a piece of paper. Had this step been taken, the Empire could have continued to implement the Tarkin Doctrine, and the Universe would be a much different place today. 

Do You Need to Do an FMEA? 

A simple truth is demonstrated by the missing nail and the kingdom, as well as the lack of an FMEA and the Death Star:  when designing a new product, whether it is an oil rig, a kitchen appliance, or a Death Star, you'll avoid many future problems by performing an FMEA early in the design phase.

About the Guest Blogger: 
Matthew Barsalou is an engineering quality expert in BorgWarner Turbo Systems Engineering GmbH’s Global Engineering Excellence department. He has previously worked as a quality manager at an automotive component supplier and as a contract quality engineer at Ford in Germany and Belgium. He possesses a bachelor of science in industrial sciences, a master of liberal studies, and a master of science in business administration and engineering from the Wilhelm Büchner Hochschule in Darmstadt, Germany.
  

Would you like to publish a guest post on the Minitab Blog? Contact publicrelations@minitab.com

 

References

Lucas, George. Star Wars, Episode IV: A New Hope. New York: Del Rey, 1976. http://www.amazon.com/Star-Wars-Episode-IV-Hope/dp/0345341465/ref=sr_1_2?ie=UTF8&qid=1358180992&sr=8-2&keywords=Star+Wars%2C+Episode+IV%3A+A+New+Hope

Opie, Iona, and Peter Opie, eds. Oxford Dictionary of Nursery Rhymes. Oxford, 1951, 324. Quoted in Lowe, E.J. “For Want of a Nail.” Analysis 40 (January 1980): 50-52. http://www.jstor.org/stable/3327327

Rayfield, Jillian. “White House Rejects 'Death Star' Petition.” Salon, January 13, 2013. Accessed January 14, 2013, from http://www.salon.com/2013/01/13/white_house_rejects_death_star_petition/

Smith, Bill, ed. Star Wars: Death Star Technical Companion. Honesdale, PA: West End Games, 1991. http://www.amazon.com/Star-Wars-Death-Technical-Companion/dp/0874311209/ref=sr_1_1?s=books&ie=UTF8&qid=1358181033&sr=1-1&keywords=Star+Wars%3A+Death+Star+Technical+Companion.

How Can a Similar P-Value Mean Different Things?


One highlight of writing for and editing the Minitab Blog is the opportunity to read your responses and answer your questions. Sometimes, to my chagrin, you point out that we've made a mistake. However, I'm particularly grateful for those comments, because it permits us to correct inadvertent errors. 

I feared I had an opportunity to fix just such an error when I saw this comment appear on one of our older blog posts:

You said a p-value greater than 0.05 gives a good fit. However, in another post, you say the p-value should be below 0.05 if the result is significant. Please, check it out!

You ever get a chill down your back when you realize you goofed? That's what I felt when I read that comment. Oh no, I thought. If the p-value is greater than 0.05, the results of a test certainly wouldn't be significant. Did I overlook an error that basic?  

Before beating myself up about it, I decided to check out the posts in question. After reviewing them, I realized I wouldn't need to put on the hairshirt after all. But the question reminded me about the importance of a fundamental idea. 

It Starts with the Hypothesis

If you took an introductory statistics course at some point, you probably recall the instructor telling the class how important it is to formulate your hypotheses clearly. Excellent advice.

However, many commonly used statistical tools formulate their hypotheses in ways that don't quite match. That's what this sharp-eyed commenter noticed and pointed out.

The writer of the first post detailed how to use Minitab to identify the distribution of your data, and in her example pointed out that a p-value greater than 0.05 meant that the data were a good fit for a given distribution. The writer of the second post—yours truly—commented on the alarming tendency to use deceptive language to describe a high p-value as if it indicated statistical significance.

To put it in plain language, my colleague's post cited the high p-value as an indicator of a positive result. And my post chided people who cite a high p-value as an indicator of a positive result. 

Now, what's so confusing about that? 

Don't Forget What You're Actually Testing

You can see where this looks like a contradiction, but to my relief, the posts were consistent. The appearance of contradiction stemmed from the hypotheses discussed in the two posts. Let's take a look. 

My colleague presented this graph, output from the Individual Distribution Identification:

Probability Plot

The individual distribution identification is a kind of hypothesis test, and so the p-value helps you determine whether or not to reject the null hypothesis.

Here, the null hypothesis is "The data follow a normal distribution," and the alternative hypothesis would be "The data DO NOT follow a normal distribution." If the p-value is over 0.05, we will fail to reject the null hypothesis and conclude that the data follow the normal distribution.

Just have a look at that p-value:

P value

That's a high p-value. And for this test, that means we can conclude the normal distribution fits the data. So if we're checking these data for the assumption of normality, this high p-value is good. 

But more often we're looking for a low p-value. In a t-test, the null hypothesis might be "The sample means ARE NOT different," and the alternative hypothesis, "The sample means ARE different." Seen this way, the arrangement of the hypotheses is the reverse of the one in the distribution identification.

Hence, the apparent contradiction. But in both cases a p-value greater than 0.05 means we fail to reject the null hypothesis. We're interpreting the p-value in each test the same way.
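You can watch both behaviors side by side in a few lines of code. This sketch uses SciPy on made-up data; a Shapiro-Wilk test stands in for Minitab's distribution identification, and a two-sample t-test plays the usual significance-hunting role:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=2)

# Normality check: the NULL is "the data follow a normal distribution,"
# so a HIGH p-value (fail to reject) is the desirable outcome here.
sample = rng.normal(50, 5, size=40)
_, p_norm = stats.shapiro(sample)
print(f"Shapiro-Wilk p = {p_norm:.3f}")  # most likely well above 0.05

# Two-sample t-test: the NULL is "the sample means are not different,"
# so a LOW p-value is what signals the interesting result.
a = rng.normal(50, 5, size=40)
b = rng.normal(54, 5, size=40)
_, p_t = stats.ttest_ind(a, b)
print(f"two-sample t p = {p_t:.4f}")  # most likely well below 0.05
```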

However, because the connotations of "good" and "bad" are different in the two examples, how we talk about these respective p-values appears contradictory—until we consider exactly what the null and alternative hypotheses are saying. 

And that's a point I was happy to be reminded of. 

 

Reducing the Phone Bill with Statistical Analysis


One of the most memorable presentations at the inaugural Minitab Insights conference reminded me that data analysis and quality improvement methods aren't only useful in our work and businesses: they can make our home life better, too. 

you won't believe how cheap my phone bill is now!

The presenter, a continuous improvement training program manager at an aviation company in the midwestern United States, told attendees how he used Minitab Statistical Software, and some simple quality improvement tools, to reduce his phone bill.

He took the audience back to 2003, when his family first obtained their cell phones. For a few months, everything was fine. Then the April bill arrived, and it was more than they expected. The family had used too many minutes. 

The same thing happened again in May. In June, the family went over the number of minutes allocated in their phone plan again, for the third month in a row. Something had to change!

Defining the Problem

His wife summed up the problem this way: "There is a problem with our cell phone plan, because the current minutes are not enough for the family members over the past three months." 

He wasn't sure that "too few minutes" was the real problem. But instead of arguing, he applied his quality improvement training to find common ground. He and his wife agreed that the previous three months' bills were too high, and that the family had gone over the plan minutes—for an unknown reason. Based on their areas of agreement, they revised the initial problem statement:

There is a problem with our cell phone usage, and this is known because the minutes are over the plan for the past 3 months, leading to a strain on the family budget.

They further agreed that before taking further action—like switching to a costlier plan with more minutes—they needed to identify the root cause of the overage. 

Using Data to Find the Root Cause(s)

pie chart of phone usage

At this point, he downloaded the family's phone logs from their cell phone provider and began using Minitab Statistical Software to analyze the data. First, he used a simple pie chart to look at who was using the most minutes. Since he also had a work-provided cell phone, it wasn't surprising to see that his wife used 4 minutes for each minute of the family plan he used. 

Since his wife used 75% of the family's minutes, he looked more closely for patterns and insights in her call data. He created time series plots of her daily and individual call minutes, and created I-MR and Xbar-S charts to assess the stability of her calling process over time. 

I-MR chart of daily phone minutes

Xbar-S Chart of Daily Minutes Per Week

He also subgrouped calls by day of the week and displayed them in a boxplot. 

Boxplot of daily minutes used

These analyses revealed that daily minute usage did contain some "special cause variation," shown in the I-MR chart. They also showed that, compared to other days of the week, Thursdays had higher average daily minutes and greater variance.
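The limits behind an individuals chart are simple to compute by hand. Here's a sketch with fabricated daily-minute totals (the presenter, of course, used his real call logs); sigma is estimated from the average moving range divided by the unbiasing constant d2 = 1.128:

```python
import numpy as np

# Fabricated daily phone-minute totals, including a couple of heavy days
minutes = np.array([12, 15, 20, 8, 18, 15, 72, 18, 25, 14, 11, 95, 22, 17, 16])

mr = np.abs(np.diff(minutes))     # moving ranges of consecutive days
center = minutes.mean()
sigma = mr.mean() / 1.128         # d2 = 1.128 for moving ranges of size 2
ucl = center + 3 * sigma
lcl = max(center - 3 * sigma, 0)  # minutes can't go below zero

print(f"CL = {center:.1f}, UCL = {ucl:.1f}, LCL = {lcl:.1f}")
print("special-cause days:", np.where((minutes > ucl) | (minutes < lcl))[0])
```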

Creating a Pareto chart of his wife's phone calls provided further insight. 

Pareto chart of number called

The Minitab analysis helped them see where and when most of their minutes were going. But as experienced professionals know, sometimes the numbers alone don't tell the entire story. So the family discussed the results to put those numbers in context and to see where some improvements might be possible.

The most commonly called number belonged to his wife's best friend, who used a different cell phone provider than the family did. This explained the Thursday calls, because every weekend his wife and her friend took turns shopping garage sales on opposite sides of town to get clothes for their children. They did their coordination on Thursday evenings.

Calls to her girlfriend could have been free if they just used the same provider, but the presenter's family didn't want to change, and it wasn't fair to expect the other family to change. But while a few calls to her girlfriend may have been costing a few dollars, the family was saving many more dollars on clothes for the kids. 

Given the complete context, this was a situation where the calls were paying for themselves, so the family moved on to the next most frequently called number: the presenter's mother's land line.

His wife spoke very frequently with his mother to arrange childcare and other matters. His mother had a cell phone from the same provider, so calls to the cell phone should be free. Why, then, was his wife calling the land line? "Because," his wife informed him, "your mother never answers her cell phone." 

Addressing the Root Cause

The next morning, the presenter visited his mother and eventually he steered the conversation to her cell phone. "I just love using the cell phone on weekends," his mother told him. "I use it to call my old friends during breakfast, and since it's the weekend the minutes are free!" 

When he asked how she liked using the cell phone during the week, his mother's face darkened. "I hate using the cell phone during the week," she declared. "The phone rings all the time, but when I answer there's never anyone on the line!"  

This seemed strange. To get some more insight, her son worked with her to create a spaghetti diagram that showed her typical movements during the weekday when her cell phone rang. That diagram, shown below, revealed two important things.

spaghetti diagram

First, it showed that his mother loved watching television during the day. But second, and more important, to answer her cell phone his mother needed to get up from her chair, walk into the dining room, and retrieve the phone, which she always kept on the table.

Her cell phone automatically sent callers to voice mail after three rings. But it took his mother longer than three rings to get from her chair to the phone. What's more, since she never learned to use the voice mail ("Son, there is no answering machine connected to this phone!") his mother almost exclusively used the cell phone to make outgoing calls. 

Now that the real reasons underlying this major drain on the minutes in the family's cell phone plan were known, a potential solution could be devised and tested. In this case, rather than force his mother to start using voicemail, he came up with an elegant and simple alternative:  

Job Instructions for Mom:

When receiving call on weekday:

  • Go to cell phone
  • Pick up phone
  • Press green button twice
  • Wait for person who called to answer phone

After a few test runs to make sure his mother was comfortable with the new protocol, they gave the new system its first month's test run. 

The Results

Solving this problem effectively required four steps. First, the presenter and his wife needed to clearly define the problem. Second, they used statistical software to get insight into the problem from the available data. From there, a spaghetti chart and a set of simple job instructions provided a very viable solution to test. And the outcome? 

Bar Chart of Phone Bills

As the bar graph shows, July's minutes were well within their plan's allotment. In that month's Pareto chart, what had been the second-largest bar dropped to near zero. His mother enjoyed her cell phone much more, and his wife was able to arrange child care with just one call. 

And to this day, when the presenter wants to talk to his mother, he 

1. Calls her cell phone
2. Lets it ring 3 times
3. Hangs up
4. Waits for her return call

Happily, this solution turned out to be very sustainable, as the monthly minutes remained within the family's allowance and budget for quite some time...until his daughter got a cell phone, and texting issues began.

Where could you apply data analysis to get more insight into the challenges you face? 

A New Spin on the "Stand in a Circle" Exercise (Part 1)


In the mid-1940s, Taiichi Ohno established the Toyota Production System, which is primarily based on eliminating non-value-added waste. He discovered that reducing waste and inventory levels exposes problems and forces employees to address them. To engage workers and thereby improve processes, Ohno developed many exercises.

One of his most popular exercises, “Stand in a Circle,” helped managers and students see process waste. During this exercise Ohno would take the manager or student to the shop floor, draw a chalk circle on the floor, then have them stand inside the circle and observe an operation. His direction would be simple: “Watch.”

Several hours later, Ohno would return and ask “What do you see?” If they saw the same problem Ohno had seen, then the exercise was over. If not, he would say “Watch some more.” This would continue until they saw the same problem Ohno had seen. This exercise helped managers learn to observe waste, and thus helped organizations identify and deal with the Seven Wastes of Lean.

1. Overproduction
Producing more than what’s actually needed by the next process or customer (The worst form of waste because it contributes to the other six).

2. Waiting
Delay, waiting or time spent in a queue with no value being added.

3. Transportation
Moving parts and products unnecessarily.

4. Over-processing
Undertaking non-value-added activity.

5. Inventory
Having more than the minimum.

6. Motion
Unnecessary movement or action.

7. Correction
Inspection, rework, and scrap.

I've been thinking about Ohno's famous exercise a lot since the winners of the Lean and Six Sigma Excellence Awards were announced at the 2017 Lean and Six Sigma World Conference in Nashville, Tenn.

For the second consecutive year, Arrow Electronics took the category for innovation, this time for its Lean Sigma Drones project. This project combines drone technology, proprietary video technology, and a rapid-improvement methodology to observe Arrow’s extensive warehouse operations from a bird's-eye view and more effectively identify areas for continuous improvement.

This new approach—appropriately named "Fly in a Circle"—has already increased the efficiency of targeted processes by 82 percent and eliminated more than 6.5 million walking steps in warehouse processes since Arrow launched it in late 2016.

Standing (or Flying) in a Circle means you go to the Gemba and observe for yourself what is actually happening. Get the facts about what is being done, not what is supposed to be done according to the procedure. Observe all the wastes you can, and write them down. Keep an open mind about your observations. Even if you know the reason behind a workaround, document it anyway—it’s still a workaround, and potentially a wasteful task. Being able to spot waste is one of the hardest parts of improving a process.

Figure 1. Waste Analysis by Operation

When performing this exercise, it is easy to fall into the trap of trying to fix the waste on the spot. Instead, use lean tools to thoroughly understand the process, then develop ways to eliminate the waste. Companion by Minitab® contains professionally designed Roadmaps™ and forms that can be used to document and further diagnose the root causes of waste. Using the Waste Analysis by Operation form (Figure 1) and performing the 5 Whys on the identified waste (Figure 2) will help you document and discover ways to eliminate waste in your operations.

Figure 2. 5 Whys

The simple exercise of Stand or Fly in a Circle will open your eyes to new ways to improve your processes by eliminating wasteful activities. As your processes and services become more effective and efficient, your customer will appreciate the improvements made in delivery, quality, and price. When an organization eliminates waste, improves quality and reduces costs, they gain a competitive advantage by responding faster and better to customer requirements and needs. 

As you prepare for your Stand or Fly in a Circle exercise, remember these inspirational words from Yogi Berra: “You can observe a lot by just watching.”

If you'd like to learn more about Companion, or try the more than 100 tools for executing and reporting on quality projects that it includes, get the free 30-day trial version for you and your team at companionbyminitab.com.

A New Spin on the "Stand in a Circle" Exercise (Part 2)


In Part 1 of my A New Spin on the "Stand in a Circle" Exercise blog, I described how Taiichi Ohno, the creator of the Toyota Production System, used the “Stand in a Circle” exercise to help managers identify waste in their operations. 

During this exercise Ohno would take a manager or student to the shop floor, draw a chalk circle on the floor, then have them stand inside the circle and observe an operation. His direction was simply, “Watch.” Several hours later Ohno would return and ask “What do you see?” If they saw the same problem Ohno had seen, then the exercise was over. If not, he would say “Watch some more.”

This would continue until the manager or student saw the same problem Ohno had seen, thus teaching them to observe waste. Ohno developed this exercise to help organizations identify and deal with the Seven Wastes of Lean.

In this post, I’ll walk you through a "Stand in a Circle" example using Companion by Minitab®. Suppose you are a process improvement practitioner at a company where full containers—boxes of tile grout—are transported from the processing area to the warehouse for shipping. The containers are stacked onto pallets, wrapped with poly sheeting, and transported to the warehouse to wait for shipping to the customer.

While standing in a circle in the middle of the warehouse, you notice and document several wasteful activities on the Waste Analysis by Operation form (Figure 1). 

Waste Analysis by Operation

Figure 1. Waste Analysis by Operation

The highest-priority issue is the container damage, so you'll address this one first. The containers can get damaged when being stacked on the pallets and transported to the shipping area. 

A Cause and Effect (C&E) diagram, or Fishbone, can be used to identify the causes of an effect or problem. During a team meeting, conduct a brainstorming session to identify the causes of the container damage. On a C&E diagram, the effect, or central problem, is on the far right. Affinities, which are categories of causes, branch from the spine of the effect, and the causes branch from the affinities. The structure of the C&E diagram immediately sorts ideas into useful categories (affinities). Use Companion’s built-in priority rating scale and color coding to flag high-, medium-, or low-priority causes for further investigation.

CandE diagram

Figure 2. Cause and Effect Diagram

Another tool to help get to the root cause of a problem is the 5 Whys line of questioning (Figure 3). By asking the question “Why?” five times, you will eventually get to the root cause of the problem and identify steps to prevent it from happening again. Both the Cause and Effect Diagram and the 5 Whys tools are best performed in a group setting with a team knowledgeable about the process.

Five Whys

Figure 3. 5 Whys Form

After solutions are identified, the team can fill out the 30-60-90 Action Plan to identify and track the long-term activities.  Using this form will help the team clearly identify:

1. What remains to be done?

2. Who is responsible?

3. When will it be done?

Action Plan

Figure 4. 30-60-90 Action Plan

As your processes and services become more effective and efficient, your customer will appreciate the improvements made in delivery, quality, and price. When an organization eliminates waste, improves quality and reduces costs, they gain a competitive advantage by responding faster and better to customer requirements and needs.  

The simple exercise of “Standing in a Circle” will open your eyes to new ways to improve your processes by eliminating wasteful activities. Using a root cause analysis tool such as the Fishbone and the 5 Whys can quickly get your team to understand the causes behind inefficient tasks. 

Once the root causes are identified, the team can get busy identifying, selecting and implementing solutions. Using a project management tool such as Companion will help keep the process improvement team organized and will keep your stakeholders and executives apprised of progress automatically.

Companion puts all of your tools in one easy-to-use application, so you'll spend less time managing projects and more time moving them forward. If you aren't already using it, you can try Companion free for 30 days.


5 Conditions that Could Put Your Quality Program on the Chopping Block


By some estimates, up to 70 percent of quality initiatives fail. Why do so many improvement programs, which are championed and staffed by smart, dedicated people, ultimately end up on the chopping block?

According to the Juran Institute, which specializes in training, certification, and consulting on quality management, the No. 1 reason quality improvement initiatives fail is a lack of management support.

At first blush, doesn't that seem like a paradox? After all, it's company leaders who start quality improvement efforts in the first place. So what happens between the time a deployment kicks off—with the C-level's enthusiastic support and participation—and the day a disillusioned C-level executive pulls the plug on a program that never seemed to deliver on its potential?

Even projects which result in big improvements often fail to make an impression on decision-makers. Why?  

The answer may be that those C-level leaders never find out about that impact. The 2013 ASQ Global State of Quality study revealed that the higher people rise in an organization's leadership, the less often they receive reports about quality metrics. Only 2% of senior executives get daily quality reports, compared to 33% of front-line staff members.

Think that's bad? A full 25% of the senior executives reported getting quality metrics only on an annual basis.

In light of findings like that, the apparent paradox of leaders losing their initial enthusiasm for quality initiatives begins to make sense. The success of the program often remains invisible to those at the top. 

That's not necessarily for a lack of trying, either. Even in organizations with robust, mature quality programs, understanding the full impact of an initiative on the bottom line can be difficult, and sometimes impossible.

For more than 45 years, Minitab has been helping companies in every industry, in virtually every country around the world, improve quality. Along the way, we've seen and identified five main challenges that can keep even the most successful deployments in the shadows.

1. Project Data Is Scattered and Inaccessible.

Individual project teams usually do a great job capturing and reporting their results. But challenges quickly arise when projects accumulate. A large company may have thousands of simultaneous quality projects active now, and countless more completed. Gathering the critical information from all of those projects, then putting it into a format that leaders can easily access and use, is an extremely daunting task—which means that many organizations simply fail to do it, and the overall impact of their quality program remains a mystery.  

2. Projects Are a Hodgepodge of Applications and Documents.

As they work through their projects, team members need to create project charters, do SIPOCs and FMEAs, evaluate potential solutions, facilitate brainstorming, and much more. In most organizations, teams have to use an assortment of separate applications for documents, process maps, value stream maps, and other essential project tools. That means the project record becomes a compilation of distinct, frequently incompatible files from many different software programs. Team members are forced to waste time entering the identical information into first one program, then another. Adding to the confusion, the latest versions of documents may reside on several different computers, so project leaders often need to track multiple versions of a document to keep the official project record current. 

3. Metrics Vary from Project to Project   

Even projects in the same department often don't treat essential metrics consistently, or don't track the same data in the same way. Multiply that across the hundreds of projects under way at any given time in an organization with many different departments and divisions, and it's not hard to see why compiling a reliable report about the impact of all these projects never happens. Even if the theoretical KPIs are consistent across an organization, when one division tracks them in apples, and the next tracks them in oranges, their results can't be evaluated or aggregated as if they were equivalent. 

4. Teams Struggle with Square-Hole Tracking Systems

Many organizations attempt to monitor and assess the impact of quality initiatives using methods that range from homegrown project databases to full-blown, extremely expensive project portfolio management (PPM) systems. Sometimes these work—at least for a while. But many organizations find maintaining their homegrown systems turns into a major hassle and expense. And as others have discovered, the off-the-shelf solutions that were created to meet the needs of information technology, finance, customer service, or other business functions don’t adequately fit or support projects that are based on quality improvement methods such as Six Sigma or Lean. The result? Systems that slowly wither as resources are directed elsewhere, reporting mechanisms that go unused, and summaries that fail to convey a true assessment of an initiative's impact even if they are used. 

5. Reporting Takes Too Much Time

There are only so many hours in the day, and busy team members and leaders need to prioritize. Especially when operating under some of the conditions described already, team leaders find reporting on projects to be a burden that just never rises to the top of the priority list. It seems like non-value-added activity to copy-and-paste information from project documents, which had to be rounded up from a bunch of different computers and servers, and then place that information into yet another format. And if the boss isn't asking for those numbers—and it appears that many C-level executives don't—most project leaders have many other tasks to which they can devote their limited time. 

How to Overcome the Challenges to Reporting on Quality

It's easy to understand why so many companies, faced with these constraints, don't have a good understanding of how their quality initiatives contribute to the overall financial picture. But recognizing the issues is the first step in fixing them. 

Organizations can establish standards and make sure that all project teams use consistent metrics. Quality professionals and their leaders can take steps to make sure that reporting on results becomes a critical step in every individual project. 

There also are solutions that tackle many of these challenges head-on. For example, Companion by Minitab takes a desktop app that provides a complete set of integrated tools for completing projects, and combines it with centralized, cloud-based storage for projects and a customizable web-based dashboard. Companion's desktop app makes it easier for practitioners to work through and finish projects—and since their project data automatically rolls up to the dashboard, reporting on projects is effortless. Literally.

For the executives, managers, and stakeholders who have never had a clear picture of their quality program, Companion opens the window on the performance, progress, and bottom-line effects of the entire quality initiative, or specific pieces of it. 

Ensuring that the results of your improvement efforts are clearly seen and understood is a challenge that every quality pro is likely to face. How do you ensure your stakeholders appreciate the value of your activities?  

 

Fundamentals of Gage R&R


Before cutting an expensive piece of granite for a countertop, a good carpenter will first confirm he has measured correctly. Acting on faulty measurements could be costly.

While no measurement system is perfect, we rely on such systems to quantify data that help us control quality and monitor changes in critical processes. So, how do you know whether the changes you see are valid and not just the product of a faulty measurement system? After all, if you can’t trust your measurement system, then you can’t trust the data it produces.

Performing a Gage R&R study can help you to identify problems with your measurement system, enabling you to trust your data and to make data-driven decisions for process improvement. 

What Can Gage R&R Do for Me?

Gage R&R studies can tell you if inconsistencies in your measurements are too large to ignore—this could be due to a faulty tool or inconsistent operation of a tool.

Reveal an inconsistent tool

Let’s look at an example to better understand how Gage R&R studies work.

Suppose a company wants to use a control chart to monitor the fill weights of cereal boxes. Before doing so, they conduct a Gage R&R study to determine whether the system that measures the weight of each cereal box produces precise measurements.

The best way to ensure that measurements are valid is to look at repeatability, or the variation of the measurements taken by the same operator for the same part. If we weigh the same cereal box under the same conditions a number of times, will we observe the same weight every time? Weighing the same box over and over again can show us how much variation exists in our measurement system.

plot

For this experiment, we can look at repeatability based on two different operators’ measurements. The Gage R&R results show that even when the same person weighs the same box on the same scale, the measurements can vary by several grams. Most likely, the scale is in serious need of recalibration. The faulty scale would have rendered a control chart for these measurements virtually useless. Although the average measurements for each operator are not far apart, the spread of the measurements is huge!

Highlight operator differences

But the variation that exists in the measurement system is just one aspect of a Gage R&R study. We must also look at reproducibility, or the variation due to different operators using the measurement system. A Gage R&R study can tell us whether a measurement differs from one operator to the next and by how much.

Suppose the same company who wishes to monitor fill weights of cereal boxes hires new employees to help record measurements. The company uses a Gage R&R to evaluate both the new operators and experienced operators.

gage R&R

The study reveals that when employees weigh the same cereal box, the measurements of new hires are too high or too low more often than the measurements of experienced employees. This finding might indicate that the company should conduct more training for the new hires.

How to Analyze a Gage R&R Study in Minitab

Awareness of how well you can measure something can have substantial financial impacts. Minitab Statistical Software makes it easy to analyze how precise your measurements are.

In the case of the company evaluating cereal box fill weights, problems of over- and under-filling have different implications. Overfilling cereal boxes costs the company money it could save with a calibrated measurement system and properly trained staff. Meanwhile, underfilling boxes angers customers, because they don't get the amount of product they paid for.

Getting started

Preparing to analyze your measurement system is easy because Minitab’s Create Gage R&R Study Worksheet can generate a data collection sheet for you. The dialog box lets you quickly specify who takes the measurements (the operators), which item they measure (the parts), and in what order the data are to be collected.

  1. Choose Stat > Quality Tools > Gage Study > Create Gage R&R Study Worksheet.
  2. Specify the number of parts, number of operators, and the number of times the same operator will measure the same part.
  3. Give descriptive names to the parts and operators so they’re easy to identify in the output.
  4. Click OK.
The main event

After you create your data collection sheet and record the measurements you observe, you can use Gage R&R Study (Crossed) to analyze the measurements.

  1. Choose Stat > Quality Tools > Gage Study > Gage R&R Study (Crossed).
  2. In Part Numbers, enter Parts.
  3. In Operators, enter Operators.
  4. In Measurement Data, enter 'Fill Weights'.
  5. Click OK.

Gage R&R Output

The study reveals that Jordan’s measurements are lower than Pat’s or Taylor’s. In fact, the %Study Variation for our total Gage R&R is high—90.39%—indicating that our measurement system is unacceptable. Identifying and eliminating the source of the difference will improve the measurement system.
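To give a feel for where that percentage comes from: %Study Variation compares a component's standard deviation to the total standard deviation of the study. Here is a minimal sketch using hypothetical variance components (the numbers are invented, not taken from the cereal study):

import math

# Hypothetical variance components from a Gage R&R ANOVA table
var_repeatability = 4.0    # equipment variation
var_reproducibility = 2.5  # operator variation
var_part_to_part = 0.6     # actual part-to-part variation

var_gage = var_repeatability + var_reproducibility
var_total = var_gage + var_part_to_part

pct_study_var = 100 * math.sqrt(var_gage / var_total)
print(f"%Study Variation (total Gage R&R): {pct_study_var:.2f}%")

# As a common rule of thumb, under 10% is acceptable and over 30%
# means the measurement system needs attention.

When the gage components dominate the total variation, as they do here, the percentage climbs toward 100% and the measurement system, not the parts, is what you are really measuring.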

Some of my colleagues offer more information on Gage R&R tools and how to interpret the output.

Putting Gage R&R Studies to Use

Taking measurements is like any other process—it’s prone to variability. Assessing and identifying where to focus efforts for reducing this variation with Minitab’s Gage R&R tools can help you ensure your measurement system is precise. 

Leaving Out-of-control Points Out of Control Chart Calculations Looks Hard, but It Isn't


Houston skyline

Control charts are excellent tools for looking at data points that seem unusual and for deciding whether they're worthy of investigation. If you use control charts frequently, then you're used to the idea that if certain subgroups reflect temporary abnormalities, you can leave them out when you calculate your center line and control limits. If you include points that you already know are different because of an assignable cause, you reduce the sensitivity of your control chart to other, unknown causes that you would want to investigate. Fortunately, Minitab Statistical Software makes it fast and easy to leave points out when you calculate your center line and control limits. And because Minitab’s so powerful, you have the flexibility to decide if and how the omitted points appear on your chart.

Here’s an example with some environmental data taken from the Meyer Park ozone detector in Houston, Texas. The data are the readings at midnight from January 1, 2014 to November 9, 2014. (My knowledge of ozone is too limited to properly chart these data, but they’re going to make a nice illustration. Please forgive my scientific deficiencies.) If you plot these on an individuals chart with all of the data, you get this:

The I-chart shows seven out-of-control points between May 3rd and May 17th.

Beginning on May 3, a two-week period contains 7 out of 14 days where the ozone measurements are higher than you would expect based on the amount that they normally vary. If we know the reason that these days have higher measurements, then we could exclude them from the calculation of the center line and control limits. Here are the three options for what to do with the points:

Three ways to show or hide omitted points

Like it never happened

One way to handle points that you don't want to use to calculate the center line and control limits is to act like they never happened. The points neither appear on the chart, nor are there gaps that show where omitted points were. The fastest way to do this is by brushing:

  1. On the Graph Editing toolbar, click the paintbrush.

The paintbrush is between the arrow and the crosshairs.

  2. Click and drag a square that surrounds the 7 out-of-control points.
  3. Press CTRL + E to recall the Individuals chart dialog box.
  4. Click Data Options.
  5. Select Specify which rows to exclude.
  6. Select Brushed Rows.
  7. Click OK twice.

On the resulting chart, the upper control limit changes from 41.94 parts per billion to 40.79 parts per billion. The new limits indicate that April 11 was also a measurement that's larger than expected based on the variation typical of the rest of the data. These two facts will be true on the control chart no matter how you treat the omitted points. What's special about this chart is that there's no suggestion that any other data exists. The focus of the chart is on the new out-of-control point:

The line between the data is unbroken, even though other data exists.
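If you're curious about the arithmetic behind those changing limits, an individuals chart typically sets them at the mean plus or minus 2.66 times the average moving range. Here is a rough Python sketch with simulated data standing in for the ozone readings (the values and row indices are made up):

import numpy as np

rng = np.random.default_rng(1)
ozone = rng.normal(30, 4, 200)   # hypothetical daily readings
ozone[122:129] += 15             # stand-in for the assignable-cause spike

def i_chart_limits(x):
    # Individuals chart: mean +/- 2.66 * average moving range
    mr_bar = np.mean(np.abs(np.diff(x)))
    center = np.mean(x)
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

print("All data:       LCL=%.2f CL=%.2f UCL=%.2f" % i_chart_limits(ozone))

# Recompute after omitting the known assignable-cause points
keep = np.ones(len(ozone), dtype=bool)
keep[122:129] = False
print("Points omitted: LCL=%.2f CL=%.2f UCL=%.2f" % i_chart_limits(ozone[keep]))

Dropping the inflated points shrinks the average moving range, which tightens the limits; that is the same effect that moved the UCL from 41.94 down to 40.79 on the chart above.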

Guilty by omission

A display that only shows the data used to calculate the center line and control limits might be exactly what you want, but you might also want to acknowledge that you didn't use all of the data in the data set. In this case, after step 6 and before clicking OK, you would check the box labeled Leave gaps for excluded points. The resulting gaps look like this:

Gaps in the control limits and data connect lines show where points were omitted.

In this case, the spaces are most obvious in the control limit line, but the gaps also exist in the lines that connect the data points. The chart shows that some data was left out.

Hide nothing

In many cases, not showing data that wasn't in the calculations for the center line and control limits is effective. However, we might want to show all of the points that were out-of-control in the original data. In this case, we would still brush the points, but not use the Data Options. Starting from the chart that calculated the center line and control limits from all of the data, these would be the steps:

  1. On the Graph Editing toolbar, click the paintbrush.

The paintbrush is between the arrow and the crosshairs.

  2. Click and drag a square that surrounds the 7 out-of-control points.
  3. Press CTRL + E to recall the Individuals chart dialog box. Arrange the dialog box so that you can see the list of brushed points.
  4. Click I Chart Options.
  5. Select the Estimate tab.
  6. Under Omit the following subgroups when estimating parameters, enter the row numbers from the list of brushed points.
  7. Click OK twice.

This chart still shows the new center line, control limits, and out-of-control point, but also includes the points that were omitted from the calculations.

Points not in the calculations are still on the chart.

Wrap up

Control charts help you identify when some of your data are different from the rest so that you can examine the cause more closely. Developing control limits that exclude data points with an assignable cause is easy in Minitab, and you have the flexibility to decide how to display those points to convey the most important information. The only thing better than getting the best information from your data? Getting the best information from your data faster!

The image of the Houston skyline is from Wikimedia commons and is licensed under this creative commons license.

Companion by Minitab: Desktop App and Web App Terminology (Part 1)


By now you have probably heard about Companion by Minitab®, our software for executing and reporting on quality improvement projects.

We've had questions about some of the terminology used in the product, which has two main components: the desktop application (the desktop app for short) and the web application (the web app for short, also sometimes referred to as the full version or subscription). If you've wondered about this terminology, I hope this post will answer your questions.

In a nutshell, Companion is a software platform for managing your continuous improvement program. There are two parts to the software: the desktop and web apps. Project owners and practitioners use the Companion desktop app to execute projects. As they progress, their project information automatically rolls up to Companion’s web app dashboard, where executives and stakeholders can see graphical summaries and reports for a high-level view of the organization’s initiatives. 

Best of all, since the Companion dashboard updates automatically, team members have more time to complete critical tasks instead of creating reports or updating information in a separate tracking database. Companion’s desktop app and dashboard work together to help you not only boost the bottom line but also demonstrate your success to the people who need to know.

Companion Big Picture

The Companion Desktop App

Companion's desktop app provides the tools and forms that project teams and practitioners need to complete projects efficiently and consistently. This is important because using consistent methodologies, forms, and metrics allows teams to devote more of their time to value-added project work.

http://support.minitab.com/en-us/companion/toolkit_annotated.png

Terminology associated with the desktop app includes:

A: Insert tab: The menu where you add phases, folders, documents, forms, and tools to your Roadmap.

B: Management section: The set of forms in a project template that contains important project data. The management section ensures consistent project definition and tracking. In the desktop app, anyone can edit the management section; in the web app, only data architects can add, delete, and reorder forms in the management section.

C: Roadmap™: The area where you open phases, folders, documents, forms, and tools to help you organize and execute your project.

D: Workspace: The area where you view and enter data in forms and work with tools.

E: Task pane (maps and brainstorming tools only): In a process map or a value stream map, the area where you can enter shape data. In a brainstorming tool, the area where you can brainstorm a list or import X and Y variables.

The Web App

The Companion web application works in concert with the desktop app to maximize the benefits of your improvement initiative and provide unparalleled insight into its impact on KPIs and the bottom line. 

The web app is the heart of Companion and fulfills two roles: configurable dashboard to display key metrics, and centralized storage for all Companion projects and templates. The web app is a cloud-optimized platform hosted by Microsoft Azure, so you are assured of the highest security offered by Microsoft. The Microsoft Azure data centers guarantee a 99.95% uptime and meet a wide range of internationally recognized security and compliance standards.

http://support.minitab.com/en-us/companion/dashboard_report_annotated.png

The terminology associated with the web app includes:

A: Report: A collection of filters, summaries, and column sets.

B: Filters: Allow you to focus on a subset of projects, based on a condition, such as region, location, or project status.

C: Summaries: Display aggregate project data, such as the number of projects in each division, the average duration of projects, or the total project savings by quarter. Also displays optional targets.

D: Column sets: Determines the fields that are displayed for each project in the projects list.

E: Projects list: Displays a list of all projects meeting the current filter's criteria.

F: Help button: Gives you access to topics, videos, the Quick Tour, and the download link for the desktop app.

G: Actions menu: Gives you access to common tasks, such as editing, copying, and creating new reports, saving a report as a PDF, and setting default reports.

My next posts will dig deeper into the detail of both Companion's desktop app and web app. 

Everyone at Minitab is excited about the new Companion by Minitab®, and we hope you are too. Companion gives your team everything it needs to streamline and standardize your process improvement program.

For further information about Companion, or to download the 30-day free trial, go to the Minitab website at http://www.minitab.com/en-us/products/companion/.

Doing Gage R&R at the Microscopic Level


by Dan Wolfe, guest blogger

How would you measure a hole that was allowed to vary one tenth the size of a human hair? What if the warmth from holding the part in your hand could take the measurement from good to bad? These are the types of problems that must be dealt with when measuring at the micron level.

a 10-micron fiber

As a Six Sigma professional, that was the challenge I was given when Tenneco entered into high-precision manufacturing. In Six Sigma projects, “gage studies” and “Measurement System Analysis (MSA)” are used to make sure measurements are reliable and repeatable. It’s tough to imagine doing that type of analysis without statistical software like Minitab.

Tenneco, the company I work for, creates and supplies clean air and ride performance products and systems for cars and commercial vehicles. Tenneco has revenues of $7.4 billion annually, and we expect to grow as stricter vehicle emission regulations take effect in most markets worldwide over the next five years.

We have an active and established Six Sigma community as part of the “Tenneco Global Process Excellence” program, and Minitab is an integral part of training and project work at Tenneco.

Verifying Measurement Systems

Verifying the measurement systems we use in precision manufacturing and assembly is just one instance of how we use Minitab to make data-driven decisions and drive continuous improvement.

Even the smallest of features need to meet specifications. Tolerance ranges on the order of 10 to 20 microns require special processes not only for manufacturing, but also measurement. You can imagine how quickly the level of complexity grows when you consider the fact that we work with multiple suppliers from multiple countries for multiple components.

To gain agreement between suppliers and Tenneco plants on the measurement value of a part, we developed a process to work through the verification of high precision, high accuracy measurement systems such as CMM and vision.

The following SIPOC (Supplier, Input, Process, Output, Customer) process map shows the basic flow of the gage correlation process for new technology.

sipoc

What If a Gage Study Fails?

If any of the gage studies fail to be approved, we launch a problem-solving process. For example, in many cases, the Type 1 results do not agree at the two locations. But given these very small tolerance ranges, seemingly small differences can have significant practical impact on the measurement value. One difference was resolved when the ambient temperature in a CMM lab was found to be out of the expected range. Another occurred when the lens types of two vision systems were not the same.

Below is an example of a series of Type 1 gage studies performed to diagnose a repeatability issue on a vision system. It shows the effect of part replacement (taking the part out of the measurement device, then setting it up again) before each measurement and the bias created by handling the part.

For this study, we compared 25 measurements made while simply letting the part sit in the machine with 25 measurements made while removing the part and setting it up again between readings. The analysis shows that picking the part up, handling it, and resetting it in the machine changes the measurement value. This was found to be statistically significant, but not practically significant. Knowing the results of this study helps our process and design engineers understand how to interpret the values given to them by the measurement labs, and gives some perspective on the considerations of the part and measurement processes.

The two graphs below show Type 1 studies done with versus without replacement of the part. There is a bias between the two studies. A test for equal variance shows a difference in variance between the two methods.

Type 1 Gage Study with Replacement

Type 1 Gage Study without Replacement

As the scatterplot below illustrates, the study done WITH REPLACEMENT has a higher standard deviation. The difference is statistically significant, but still practically acceptable.

With Replacement vs. Without Replacement
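The statistical comparison behind those two graphs is straightforward to reproduce. Below is a hedged sketch using simulated measurements rather than Tenneco's data: a Welch t-test checks for bias between the two methods, and Levene's test (one common choice for an equal-variance test) checks for a difference in spread:

import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical measurements (microns) of one reference part, 25 repeats each
without_replacement = rng.normal(50.0, 0.8, 25)  # part left in the fixture
with_replacement = rng.normal(50.6, 1.4, 25)     # part re-set before each reading

# Bias between the two methods (difference in means)
t_stat, t_p = stats.ttest_ind(with_replacement, without_replacement, equal_var=False)
print(f"Bias test: t={t_stat:.2f}, p={t_p:.4f}")

# Difference in spread between the two methods
w_stat, w_p = stats.levene(with_replacement, without_replacement)
print(f"Equal-variance test: W={w_stat:.2f}, p={w_p:.4f}")

Small p-values say the differences are statistically significant; whether a sub-micron bias matters in practice is the engineering judgment the team made above.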

Minitab’s gage study features are a critical part of the gage correlation process we have developed. Minitab has been integrated into Tenneco’s Six Sigma program since it began in 2000.

The powerful analysis and convenient graphing tools are being used daily by our Six Sigma resources for these types of gage studies, problem-solving efforts, quality projects, and many other uses at Tenneco.

 

About the Guest Blogger:

Dan Wolfe is a Certified Lean Six Sigma Master Black Belt at Tenneco. He has led projects in Engineering, Supply Chain, Manufacturing and Business Processes. In 2006 he was awarded the Tenneco CEO award for Six Sigma. As a Master Black Belt he has led training waves, projects and the development of business process design tools since 2007. Dan holds a BSME from The Ohio State University, an MSME from Oakland University, and a degree in automotive engineering from the Chrysler Institute of Engineering.

 

Would you like to publish a guest post on the Minitab Blog? Contact publicrelations@minitab.com.

See the New Features and Enhancements in Minitab 18 Statistical Software


It's a very exciting time at Minitab's offices around the world, because we've just announced the availability of Minitab® 18 Statistical Software.

What's new in Minitab 18?

Data is everywhere today, but to use it to make sound, strategic business decisions, you need to have tools that turn that data into knowledge and insights. We've designed Minitab 18 to do exactly that. 

We've incorporated a lot of new features, made some great enhancements and put a lot of energy into developing a tool that will make getting insight from your data faster and easier than ever before, and we're excited to get feedback from you about the new release. 

The advanced capabilities we've added to Minitab 18 include tools for measurement systems analysis, statistical modeling, and Design of Experiments (DOE). With Minitab 18, it’s much easier to test how a large number of factors influence process output, and to get more accurate results from models with both fixed and random factors.

We'll delve into more detail about these features in the coming weeks, but today I wanted to give you a quick overview of some of the most exciting additions and improvements. You can also check out one of our upcoming webinars to see the new features demonstrated. Then I hope you'll check them out for yourself—you can get Minitab 18 free for 30 days.

Updated Session Window

updated session window in Minitab 18

The first thing longtime Minitab users are likely to notice when they launch Minitab 18 is the enhancements we've made to the Session window, which contains the output of all your analyses. 

The Session window looks better, and also now includes the ability to:
  • Specify the number of significant digits (decimal places) in your output
  • Go directly to graphs by clicking links in the output
  • Expand and collapse analyses for easier navigation
  • Zoom in and out 
sort worksheets in Minitab 18's project manager

Sort Worksheets in the Project Manager

We've also added the option to sort the worksheets in your project by title or in chronological order, so you can manage and work with your data in the Project Manager more easily.

Definitive Screening Designs

Many businesses need to determine which inputs make the biggest impact on the output of a process. When you have a lot of inputs, as most processes do, this can be a huge challenge. Standard experimental methods can be costly and time-consuming, and may not be able to distinguish main effects from the two-way interactions that occur between inputs.

That challenge is answered in Minitab 18 with Definitive Screening Designs, a type of designed experiment that minimizes the number of experimental runs required, but still lets you identify important inputs without confounding main effects and two-way interactions.

Restricted Maximum Likelihood (REML) Estimation

Another feature we've added to Minitab 18 is restricted maximum likelihood (REML) estimation. This is an advanced statistical method that improves inferences and predictions while minimizing bias for mixed models, which include both fixed and random factors.

New Distributions for Tolerance Intervals

With Minitab 18 we've made it easy to calculate statistical tolerance intervals for nonnormal data, with distributions including the Weibull, lognormal, exponential, and more.

Effects Plots for Designed Experiments (DOE)

In another enhancement to our Design of Experiments (DOE) functionality, we've added effects plots for general factorial and response surface designs, so you can visually identify significant X’s.

Historical Standard Deviation in Gage R&R

If you're doing the measurement system analysis method known as Gage R&R, Minitab 18 enables you to enter a user-specified process (historical) standard deviation in relevant calculations.

Response Optimizer for GLM

When you use the response optimizer for the general linear model (GLM), you can include both your factors and covariates to find optimal process settings.

Output in Table Format to Word and Excel

The Session window output can be exported to Word and Excel in table format, which lets you easily customize the appearance of your results.

Command Line Pane

Many people use Minitab's command line to expand the software's functionality. With Minitab 18, we've made it easy to keep commands separate from the Session output with a docked command line pane. 

Updated Version of Quality Trainer

Finally, it's worth mentioning that the release of Minitab 18 is complemented by a new version of Quality Trainer by Minitab®, our e-learning course. It teaches you how to solve real-world quality improvement challenges with statistics and Minitab, and lets you refresh that knowledge anytime. If you haven't tried it yet, you can check out a sample chapter now. 

We hope you'll try the latest Minitab release!  And when you do, please be sure to let us know what you think: we love to get your feedback and input about what we've done right, and what we can make better! Send your comments to feedback@minitab.com.  

Companion by Minitab: Deep Dive into the Desktop App (Part 2)


Companion by Minitab® is our software for executing and reporting on quality improvement projects. It has two components, a desktop app and a web app. As practitioners use the Companion desktop app to do project work, their project information automatically rolls up to Companion’s web app dashboard, where stakeholders can see graphical summaries and reports. Since the dashboard updates automatically, teams are freed to complete critical tasks instead of creating reports or entering data in a separate system.

In this blog, I will explore the desktop app, and in a future blog, I will explore the web app.

Companion Big Picture

The Companion Desktop Application

Companion's desktop application provides tools and forms that are used by the project owners and practitioners to execute projects efficiently and consistently. Using consistent methodologies, forms, and metrics allows teams working on projects to devote more of their time to critical, value-added project tasks. 

The desktop app delivers a comprehensive set of integrated project tools, in an easy-to-use interface.

  • The Project Manager is a window that provides access to high-level project data. It also includes the Roadmap™, which shows the phases and specific tools used to organize and complete projects.
  • The workspace is where team members work with individual tools. The workspace always displays the currently active tool.

Desktop UI

The Project Manager 

The Project Manager offers instant access to project data and tools. The Management Section includes the following components:

Management Forms

Project Today:
Provides a snapshot of overall project status, health, and phases.

Project Charter:
Defines the project and its benefits, and is updated as the project progresses.

Financial Data:
Records the project’s financial impact in terms of annualized or monthly hard and soft savings.

Team Members and Roles:
Compiles contact and role information for each member of the project team. Easily imports contacts from Microsoft Outlook and from your Companion subscription user list.

Tasks:
Outlines the actions required to complete the project. Enables team leaders to identify and assign responsibilities, set priorities, and establish due dates.

Roadmap™

Roadmaps

Companion’s Roadmap™ feature gives teams a clear path to execute and document each phase of their projects. The Companion desktop app includes predefined Roadmap™ templates based on common continuous improvement methodologies, including DMAIC, Kaizen, QFD, CDOV, PDCA, and Just Do It.

The Roadmap contains phases, and the phases contain the tools appropriate to each phase. However, because every project is different, users can easily add or remove tools as needed. Built-in guidance for each tool further helps practitioners complete their tasks in a timely manner. 

Since many organizations use their own methods, metrics, and KPIs, we’ve made it simple to create or customize a Roadmap™ for your organization’s unique approach to improvement. 

Powerful Project Tools, All in One Place

Companion’s desktop app includes a full set of easy-to-use tools, such as:

Insert Tool

• Value stream map

• FMEA

• Process map

• Brainstorming

• Monte Carlo simulation

• And many more

As teams add specific tools to their project file, they appear within the selected phases of a Roadmap™. You can even customize or build tools from scratch (Blank Form) for processes or methods unique to your organization.

Data sharing in forms and tools

The tools within the Companion desktop app are smart and integrated. Information you add in one tool can be used in other tools, so you only need to type it once—no more redundant entry of the same information into multiple documents and applications!

For example, as you complete a C&E Matrix, you can import the variables you previously added to a process map. And as you rate the importance of the inputs relative to the outputs in the matrix, Companion calculates the results to build a Pareto chart on the fly. You can easily create forms that include your own custom charts and calculations, too.

CE Matrix
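The calculation Companion performs behind that Pareto chart is simple weighted scoring. As a rough illustration (the outputs, inputs, and ratings below are all hypothetical), each input's score is the sum, across outputs, of the output's importance times the input-output relationship rating:

# Hypothetical C&E matrix: output importance weights and
# input-vs-output relationship ratings
importance = {"Strength": 10, "Finish": 7, "Cost": 5}

ratings = {
    "Temperature": {"Strength": 9, "Finish": 3, "Cost": 1},
    "Pressure":    {"Strength": 3, "Finish": 9, "Cost": 3},
    "Cycle time":  {"Strength": 1, "Finish": 3, "Cost": 9},
}

scores = {
    inp: sum(importance[out] * r for out, r in rel.items())
    for inp, rel in ratings.items()
}

# Pareto ordering: highest weighted score first
for inp, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{inp:12s} {score}")

Sorting the inputs by that score is exactly what puts the tallest bars first on the Pareto chart.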

Monte Carlo Simulation Tool

Companion by Minitab® contains a very powerful Monte Carlo simulation tool. With its easy-to-use interface and guided workflow, this tool helps engineers and process improvement practitioners quickly simulate product results, and it provides step-by-step guidance for finding the settings for process inputs that result in acceptable outputs. 

The results are easy to understand and next steps are identified. The tool includes Parameter Optimization to find the optimal settings for your input parameters to improve results and reduce defects. It also includes Sensitivity Analysis to quickly identify and quantify the factors driving variation. By using these to pinpoint exactly where to reduce variation, you can quickly get your process where it needs to be.

Monte Carlo Simulation

Companion by Minitab's desktop application is an excellent tool that can propel your projects to success. It gives you the tools for executing projects all in one place, Roadmaps to guide your teams through the appropriate problem-solving process, interconnected forms to eliminate redundant data entry—and because it automatically updates the Companion dashboard, it even makes project reporting completely effortless. Literally.

I believe Companion is the best tool on the market for efficient project execution and summarizing the project work. Why wouldn’t you want to give your people the best tools to make difficult problem solving and reporting easier?

Visit our site for more information about Companion by Minitab® or to download your 30-day free trial for your entire team.

 


A Swiss Army Knife for Analyzing Data


Easy access to the right tools makes any task easier. That simple idea has made the Swiss Army knife essential for adventurers: just one item in your pocket gives you instant access to dozens of tools when you need them.  

swiss army knife

If your current adventures include analyzing data, the multifaceted Editor menu in Minitab Statistical Software is just as essential.

Minitab’s Dynamic Editor Menu

Whether you’re organizing a data set, sifting through Session window output, or perfecting a graph, the Editor menu adapts so that you never have to search for the perfect tool.

The Editor menu only contains tools that apply to the task you're engaged in. When you’re working with a data set, the menu contains only items for use in the worksheet. When a graph is active, the menu contains only graph-related tools. You get the idea.

Graphing

When a graph window is active, the Editor menu contains over a dozen graph tools. Here are a few of them.

editor menu for graphs

ADD

Use Editor > Add to add reference lines, labels, subtitles, and much more to your graphs. The contents of the Add submenu will change depending on the type of graph you're editing.

MAKE SIMILAR GRAPH

The editing features in Minitab graphs make it easy to create a graph that looks just right. But it may not be easy to reproduce that look a few hours (or a few months) later.

With most graphs, you can use Editor > Make Similar Graph to produce another graph with the same edits, but with new variables.

make similar graph dialog

 

Entering data and organizing your worksheet

When a worksheet is active, the Editor menu contains tools to manipulate both the layout and contents of your worksheet. You can add column descriptions; insert cells, columns or rows; and much more, including the items below.

VALUE ORDER

By default, Minitab displays text data alphabetically in output. But sometimes a different order is more appropriate (for example, “Before” then “After”, instead of alphabetical order). Use Editor > Column > Value Order to ensure that your graphs and other output appear the way that you intend.

ASSIGN FORMULA TO COLUMN

editor menu assign formula

You can assign a formula to a worksheet column that updates when you add or change data.

Session window

As the repository for output, the Session window is already an important component of any Minitab project, but the Editor menu makes it even more powerful. 

SHOW COMMAND LINE

For example, most users rely on menus to run analyses, but you can extend the functionality of Minitab and save time on routine tasks with Minitab macros. If you select the "Show Command Line" option, you'll see the command language generated with each analysis, which opens the door to macro writing.

editor-menu-show-command-line

In previous versions of Minitab, the Command Line appeared in the Session window. In Minitab 18, the Command Line appears in a separate pane, which keeps the Session window output clean and displays all of the commands together. The new Command Line pane is highlighted in the screen shot below:

graph with command pane

 

NEXT COMMAND / PREVIOUS COMMAND / EXPAND ALL / COLLAPSE ALL

After you run several analyses, you may have a great deal of output in your Session window. This group of items makes it easy to find the results that you want, regardless of project size.

Next Command and Previous Command will take you back or forward one step from the currently selected location in your output.

editor menu - next command, expand or collapse all

Expand All and Collapse All capitalize on a new feature in Minitab 18's redesigned Session window. Now you can select individual components of your output and choose whether to display all of the output (Expanded), or only the output title (Collapsed). Here's an example of an expanded output item:

expanded session window item

And here's how the same output item appears when collapsed:

collapsed session item

When you have a lot of output items in the session window, the "Collapse All" function can make it extremely fast to scroll through them and find exactly the piece of your analysis you need at any given moment. 

Graph brushing

Graph exploration sometimes calls for graph brushing, which is a powerful way to learn more about the points on a graph that interest you. Here are two of the specialized tools in the Editor menu when you are in “brushing mode”.

SET ID VARIABLES

It’s easy to spot an outlier on a graph, but do you know why it’s an outlier? Setting ID variables allows you to see all of the information that your dataset contains for an individual observation, so that you can uncover the factors that are associated with its abnormality.

CREATE INDICATOR VARIABLE

As you brush points on a graph, an indicator variable “tags” the observations in the worksheet. This enables you to identify these points of interest when you return to the worksheet.

Putting the Dynamic Editor Menu to Use

Working on a Minitab project can feel like many jobs rolled into one—data wrestler, graph creator, statistical output producer. Each task has its own challenges, but in every case you can reach for the Editor menu to locate the right tools.

 

Companion by Minitab: Deep Dive into the Web App (Part 3)


Companion by Minitab® is our software for executing and reporting on quality improvement projects. It consists of a desktop app, which practitioners use to do project work, and a web app, which includes a customizable dashboard that offers stakeholders up-to-the-minute graphical summaries and reports. Since the desktop app automatically updates the dashboard as teams do their work, teams are freed to complete critical tasks instead of creating reports or entering data in a separate system.

In this blog, I will explore the web app, following up on earlier posts that provided an overview of the Companion platform, and detailed features in the desktop app.

Companion Big Picture

Companion by Minitab's Web Application

Companion helps teams complete their projects faster and more consistently, while giving you and your stakeholders insight to make critical business decisions.

The focal point of Companion’s web app is a dashboard that gives you visibility into your entire program. The dashboard makes it easy to assess the progress of all projects, or just a subset. You can monitor the KPIs you need to make important business decisions or search for, open, and explore projects to see detailed activities at the individual project level. Dynamic reports can give everyone in your company access to the information you want to share, but you also can restrict access to sensitive projects and data to the appropriate people. 

The Companion web app works in concert with the desktop app to maximize the benefits of your improvement initiative and provide unparalleled insight into its impact on KPIs and the bottom line.

The web app consists of three components: the project repository, the dashboard, and the design center.

The Project Repository

Companion’s project repository is a secure, centralized storage system that houses all of your organization’s individual improvement projects and can be accessed from anywhere. The repository makes it easy for project owners and administrators to grant or revoke access rights to projects, and to include and exclude projects from dashboard reports.

Project List

Companion's Project Repository

In addition, you can use filters to display all projects, only the projects you own, or the projects that have been shared with you, making it easy to find the projects you're part of.

Project Filter

Project Filters

The Dashboard

Companion’s dashboard draws on the data from projects stored in the repository to provide a dynamic graphical summary of your program. It can show you financial summaries, status reports, project impacts, progress toward set targets, and more. View your entire initiative, or select and focus on specific projects, teams, or divisions. You can access the dashboard wherever and whenever you need to, from any Internet-connected computer, tablet or device.

The components and features of the dashboard are shown and detailed below:

http://support.minitab.com/en-us/companion/dashboard_report_annotated.png

A.  Report:  A collection of filters, summaries, and column sets.

B. Filters: Allow you to focus on a subset of projects, based on a condition, such as region, location, or project status.

C. Summaries: Display aggregate project data, such as the number of projects in each division, the average duration of projects, or the total project savings by quarter. Summaries can also display optional targets.

D. Column Set: Determines the fields that are displayed for each project in the project list.

E. Project List: Displays a list of all projects meeting the current filter’s criteria.

F: Help:  Gives you access to topics, videos, the Quick Tour, and the download link for the desktop app.

G: Action Menu: Gives you access to common tasks, such as editing, copying, and creating new reports, saving a report as a PDF, and setting default reports.


Tailor-made Reports

You can create an unlimited number of dashboard reports on different aspects of your initiative. Create reports that include only projects from specific facilities, as well as reports that summarize information from across the organization. Any report can deliver as much or as little detail as needed. 

Reports can be public and visible to everyone in your subscription, or they can be private and visible only to you. Icons to the right of the dashboard title indicate whether the report is private or public, as shown.

Report Icons

The Design Center

When you deploy Companion, your data architects customize your subscription to reflect your improvement methodology. But as organizations and processes evolve, so will your needs. Companion’s design center makes it easy to edit and create templates, forms, tools, and data fields.

Dashboard Examples

 

The design center automatically tracks the changes you make, so you know what was changed and when. Data architects work in the sandbox (shown below), a safe and risk-free environment for making changes to web app features. Best of all, even while your data architect is updating project templates, data definitions, and forms, there is zero downtime for your users. 

Sandbox

Companion's Sandbox

The cloud-based web app is hosted by Microsoft Azure with automatic daily, weekly and monthly backups to safeguard your data using the latest methods.  Microsoft Azure data centers guarantee a 99.95% uptime and meet a wide range of internationally recognized security and compliance standards.

Companion deploys quickly—your entire organization can be up and running in a matter of days. Easy-to-customize roadmaps and templates ensure teams follow your company’s methods and provide the information you need.  Companion is the best solution for managing, understanding, and sharing the impact of your continuous improvement program.

Why wouldn’t you want to give your people the best tools to make difficult problem solving and reporting easier?

For more information about Companion by Minitab® or to download your 30-day free trial, please visit our website at http://www.minitab.com/products/companion/

 

Need to Validate Minitab per FDA Guidelines? Get Minitab's Validation Kit


Last week I was fielding questions on social media about Minitab 18, the latest version of our statistical software. Almost as soon as the new release was announced, we received a question that comes up often from people in pharmaceutical and medical device companies:

pills

"Is Minitab 18 FDA-validated?"

How Software Gets Validated

That's a great question. To satisfy U.S. Food and Drug Administration (FDA) regulatory requirements, many firms—including those in the pharmaceutical and medical device industries—must validate their data analysis software. That can be a big hassle, so to make this process easier, Minitab offers a Validation Kit.

We conduct extremely rigorous and extensive internal testing of Minitab Statistical Software to assure the numerical accuracy and reliability of all statistical output. Details on our software testing procedures can be found in the validation kit. The kit also includes an automated macro script to generate various statistical and graphical analyses on your machine. You can then compare your results to the provided output file that we have validated internally to ensure that the results on your machine match the validated results.

Intended Use

FDA regulations state that the purchaser must validate software used in production or as part of a quality system for the “intended use” of the software. FDA’s Code of Federal Regulations Title 21 Part 820.70(i) lays it out:

“When computers or automated data processing systems are used as part of production or the quality system, the manufacturer shall validate computer software for its intended use according to an established protocol.”

FDA provides additional guidance for medical device makers in Section 6.3 of “Validation of Automated Process Equipment and Quality System Software” in the Principles of Software Validation; Final Guidance for Industry and FDA Staff, January 11, 2002.

“The device manufacturer is responsible for ensuring that the product development methodologies used by the off-the-shelf (OTS) software developer are appropriate and sufficient for the device manufacturer's intended use of that OTS software. For OTS software and equipment, the device manufacturer may or may not have access to the vendor's software validation documentation. If the vendor can provide information about their system requirements, software requirements, validation process, and the results of their validation, the medical device manufacturer can use that information as a beginning point for their required validation documentation.”

Validation for intended use consists of mapping the software requirements to test cases, where each requirement is traced to a test case. Test cases can contain:

  • A test case description. For example, Validate capability analysis for Non-Normal Data.
  • Steps for execution. For example, go to Stat > Quality Tools > Capability Analysis > Nonnormal and enter the column to be evaluated and select the appropriate distribution.
  • Test results (with screen shots).
  • Test pass/fail determination.
  • Tester signature and date.
An Example

There is good reason for the “intended use” guidance when it comes to validation. Here is an example:

Company XYZ is using Minitab to estimate the probability of a defective part in a manufacturing process. If the size of Part X exceeds 10, the product is considered defective. They use Minitab to perform a capability analysis by selecting Stat > Quality Tools > Capability Analysis > Normal.

In the following graph, the Ppk (1.32) and PPM (37 defects per million) are satisfactory.

Not Validated for Non-Normal Capability Analysis

However, these good numbers would mislead the manufacturer into believing this is a good process. Minitab's calculations are correct, but these data are non-normal, so normal capability analysis was the wrong procedure to use.

Fortunately, Minitab also offers non-normal capability analysis. As shown in the next graph, if we choose Stat > Quality Tools > Capability Analysis > Nonnormal and select an appropriate distribution (in this case, Weibull), we find that the Ppk (1.0) and PPM (1343 defects per million) are actually not acceptable:

Validated for Non Normal Capability Analysis
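You can see why the distribution choice matters so much by estimating the tail probability both ways. The sketch below is illustrative only (simulated right-skewed data, not the data from the example above): it fits a normal and a Weibull model to the same measurements and converts each upper-tail probability into defects per million:

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
usl = 10.0  # upper spec limit for the size of Part X
data = stats.weibull_min.rvs(1.8, scale=4.5, size=500, random_state=rng)

# Normal-based estimate of defects per million
mu, sigma = data.mean(), data.std(ddof=1)
ppm_normal = 1e6 * stats.norm.sf(usl, mu, sigma)

# Weibull-based estimate
shape, loc, scale = stats.weibull_min.fit(data, floc=0)
ppm_weibull = 1e6 * stats.weibull_min.sf(usl, shape, loc, scale)

print(f"PPM (normal assumption): {ppm_normal:,.0f}")
print(f"PPM (Weibull fit):       {ppm_weibull:,.0f}")

With skewed data, the normal model can badly understate the probability of a defect, which is the same trap shown in the two graphs above.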

Thoroughly identifying, documenting, and validating all intended uses of the software helps protect both businesses that make FDA-regulated products and the people who ultimately use them.

Software Validation Resources from Minitab

To download Minitab's software validation kit, visit http://www.minitab.com/support/software-validation/

In addition to details regarding our testing procedures and a macro script for comparing your results to our validated results, the kit also includes software lifecycle information.

Additional information about validating Minitab relative to the FDA guideline CFR Title 21 Part 11 is available at this link:

http://it.minitab.com/support/answers/answer.aspx?id=2588

If you have any questions about our software validation process, please contact us.

Making Steel Even Stronger with Monte Carlo Simulation


If you have a process that isn’t meeting specifications, using Monte Carlo simulation and optimization can help. Companion by Minitab offers a powerful, easy-to-use tool for Monte Carlo simulation and optimization, and in this blog we'll look at the case of product engineers involved in steel production for automobile parts, and how they could use Companion to improve a process.

steel production

The tensile strength of Superlative Auto Parts’ new steel parts needs to be at least 600 MPa. The important inputs for this manufacturing process are the melting temperature of the steel and the amount of carbon, manganese, cobalt, and phosphorus it contains. The following transfer equation models the steel’s tensile strength:

Strength = -1434 + 1.1101*MeltTemp + 1495*Carbon + 174.3*Manganese - 7585*Cobalt - 3023*Phosphorus

Building your process model

To assess the process capability, you can enter information about your current process inputs into Companion’s straightforward interface.

Suppose that while you know most of your inputs follow a normal distribution, you’re not sure about the distribution of melting temperature. As long as you have data about the process, you can just select the appropriate column in your data sheet and Companion will recommend the appropriate distribution for you.

determining distribution from data

In this case, Companion recommends the Weibull distribution as the best fit and then automatically enters the "MeltTemp" distribution information into the interface.

companion monte carlo tool - define model
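Under the hood, distribution identification amounts to fitting several candidate distributions and comparing how well each one matches the data. The sketch below mimics the idea in Python with simulated readings; Companion uses its own fitting and selection criteria, so treat the Kolmogorov-Smirnov statistic here as a simple stand-in:

import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Hypothetical melt-temperature readings from the process data sheet
melt_temp = stats.weibull_min.rvs(1.8, loc=1400, scale=120, size=300,
                                  random_state=rng)

for name in ["norm", "lognorm", "weibull_min"]:
    dist = getattr(stats, name)
    params = dist.fit(melt_temp)
    # Smaller Kolmogorov-Smirnov statistic = closer fit to the data
    ks = stats.kstest(melt_temp, name, args=params).statistic
    print(f"{name:12s} KS statistic = {ks:.4f}")

The distribution with the smallest statistic fits best; that is the spirit of the recommendation Companion makes automatically.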

Once you have entered all of your input settings, your transfer equation, and the lower specification limit, Companion completes 50,000 simulations for the steel production.

Understanding your results

initial monte carlo simulation results

The process capability statistic (Cpk) for your process is 0.417, far short of the minimum standard of 1.33. Companion also indicates that under current conditions, 14 percent of your parts won’t meet the minimum specification.
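For readers who like to peek behind the curtain, the core of the simulation is easy to express in code. This is only a sketch: the input means and standard deviations below are invented for illustration, and for simplicity every input is drawn from a normal distribution even though Companion fit a Weibull to MeltTemp:

import numpy as np

rng = np.random.default_rng(42)
n = 50_000   # same number of trials Companion runs
lsl = 600.0  # lower spec limit for tensile strength (MPa)

# Assumed input distributions (illustrative values only)
melt_temp  = rng.normal(1500, 40, n)
carbon     = rng.normal(0.30, 0.02, n)
manganese  = rng.normal(1.00, 0.10, n)
cobalt     = rng.normal(0.020, 0.005, n)
phosphorus = rng.normal(0.008, 0.002, n)

# Transfer equation from the article
strength = (-1434 + 1.1101 * melt_temp + 1495 * carbon
            + 174.3 * manganese - 7585 * cobalt - 3023 * phosphorus)

pct_defective = 100 * np.mean(strength < lsl)
cpk = (strength.mean() - lsl) / (3 * strength.std(ddof=1))  # one-sided capability

print(f"Simulated Cpk: {cpk:.3f}")
print(f"% below spec:  {pct_defective:.1f}%")

With these made-up inputs, the sketch lands in the same neighborhood as the article's results: a Cpk well below 1.33 and roughly one part in eight out of spec.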

Finding optimal input settings

The Companion Monte Carlo tool’s smart workflow guides you to the next step for improving your process: optimizing your inputs.

parameter optimization guidance

You set the goal—maximizing the tensile strength—and enter the high and low values for your inputs. Companion does the rest.
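Because the transfer equation is linear, parameter optimization has a simple structure: the predicted strength is maximized by pushing each input mean toward its favorable bound. Here is a hedged sketch of that idea; the low/high limits below are hypothetical, and Companion's optimizer also accounts for the input distributions, which this sketch ignores:

import numpy as np
from scipy.optimize import minimize

# Transfer equation coefficients (order: MeltTemp, C, Mn, Co, P)
coef = np.array([1.1101, 1495, 174.3, -7585, -3023])
intercept = -1434

def neg_strength(x):
    # Negative predicted strength; minimizing this maximizes strength
    return -(intercept + coef @ x)

# Hypothetical low/high limits for each input mean
bounds = [(1450, 1550),    # MeltTemp
          (0.25, 0.35),    # Carbon
          (0.90, 1.10),    # Manganese
          (0.010, 0.030),  # Cobalt
          (0.005, 0.011)]  # Phosphorus

x0 = np.array([b[0] for b in bounds])
result = minimize(neg_strength, x0, bounds=bounds, method="L-BFGS-B")

print("Optimal input means:", np.round(result.x, 4))
print(f"Predicted strength at optimum: {-result.fun:.1f} MPa")

As expected for a linear model, the optimum sits at the bounds: high for inputs with positive coefficients, low for the ones with negative coefficients.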

parameter optimization dialog

Simulating the new process

After finding the optimal input settings in the ranges you specified, Companion presents the simulated results for the recommended process changes.

monte carlo simulation of tensile strength

The simulation indicates that the optimal settings identified by Companion will virtually eliminate out-of-spec product from your process, with a Cpk of 1.56—a vast improvement that exceeds the 1.33 Cpk standard. Thanks to you, Superlative Auto Parts’ steel products won’t be hitting any bumps in the road.

Getting great results

Figuring out how to improve a process is easier when you have the right tool to do it. With Monte Carlo simulation to assess process capability and Parameter Optimization to identify optimal settings, Companion can help you get there. And with Sensitivity Analysis to pinpoint exactly where to reduce variation, you can further improve your process and get the product results you need.
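For a linear transfer equation with independent inputs, a quick way to see what sensitivity analysis will tell you is to decompose the output variance: each input contributes its squared coefficient times its variance. The sketch below reuses the invented standard deviations from the earlier sketch and is only a stand-in for Companion's sensitivity analysis:

# Transfer equation coefficients and assumed input standard deviations
coef = {"MeltTemp": 1.1101, "Carbon": 1495, "Manganese": 174.3,
        "Cobalt": -7585, "Phosphorus": -3023}
sd = {"MeltTemp": 40, "Carbon": 0.02, "Manganese": 0.10,
      "Cobalt": 0.005, "Phosphorus": 0.002}

# Each input contributes (coefficient * sd)^2 to the output variance
contrib = {k: (coef[k] * sd[k]) ** 2 for k in coef}
total = sum(contrib.values())

for k, v in sorted(contrib.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{k:11s} {100 * v / total:5.1f}% of output variance")

Whichever inputs top that list are where tightening the variation pays off most.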

To try the Monte Carlo simulation tool, as well as Companion's more than 100 other tools for executing and reporting on quality projects, learn more and get the free 30-day trial version for you and your team at companionbyminitab.com.

5 Tips to Make Process Improvements Stick!


For a process improvement practitioner, finishing the Control Phase of the DMAIC process is your ticket to move on to your next project. You’ve done an excellent job leading the project team: together you identified root causes, developed and implemented solutions to resolve them, put a control plan in place, and transitioned the process back to the Process Owner. Soon, however, you learn that the process has reverted to its original state.

I’ve often heard project leaders lament, “We worked so hard to identify and implement these solutions—why won’t they stick?”

So let's talk about fishing for a moment, because it offers some great lessons for making process change. Remember the quote, “Give a man a fish, and you feed him for a day. Teach a man to fish, and you feed him for a lifetime?” Seems simple enough, right? But what is involved, and how long does it take to teach people to fish so they can eat for a lifetime?

The same is true for process improvements. Seems simple enough to make a change and expect it to stick. So why is it so hard?

catch a fish

The fishing analogy hits home with me. I love to go fishing and have been an avid angler since I was young. And though it’s been a while since I taught my kids how to fish, I do remember it was a complicated process. There is a lot to learn about fishing—such as what type of equipment to use, rigging the rod, baiting the hook, deciding where to fish, and learning how to cast the line.

One of the most important fishing tips I can offer a beginner is that it's better to go fishing five times in a few weeks as opposed to five times in an entire year. Skills improve quickly with a focused effort and frequent feedback. People who spread those introductory fishing experiences out over a year wind up always starting over, and that can be frustrating. While there are people who are naturally good at fishing and catch on (pun intended) right away, they are rare. My kids needed repeated demonstrations and lots of practice, feedback and positive reinforcement before they were able to fish successfully. Once they started catching fish, their enthusiasm for fishing went through the roof!

Tips for Making Process Improvements Stick

Working with teams to implement process change is similar. Most workers require repeated demonstrations, lots of practice, written instructions, feedback and positive reinforcement before the new process changes take hold.  

Here are several tips you can use to help team members be successful and implement process change more quickly. Take the time to design your solution implementation strategy and control plan with these tips in mind. Also, Companion by Minitab® contains several forms that can make implementing these tips easy.

Tip #1: Pilot the Solution in the Field

A pilot is a small-scale test of a proposed solution. It's like learning to fish from the shore before you go out on a boat in the ocean with a 4-foot swell. A pilot evaluates both the solution and its implementation so that the full-scale rollout is more effective: it provides data about expected results and exposes issues with the implementation plan. The pilot should test whether the process meets both your specifications and customer expectations. First impressions can make or break your process improvement solution, so test the solution with a small group to work out any kinks; a smooth implementation will help workers accept the solution at the formal rollout. Use a form like the Pilot Scale-Up Form (Figure 1) to capture issues that need resolution prior to full implementation.

Pilot
Figure 1. Pilot Scale-Up Form

Tip #2: Implement Standard Work

Standard work is one of the most powerful but least used lean tools to maintain improved process performance. By documenting the current best practice, standardized work forms the baseline for further continuous improvement. As the standard is improved, the new standard becomes the baseline for further improvements, and so on.

Use a Standard Work Combination Chart (Figure 2) to show the manual, machine, and walking time associated with each work element. The output graphically displays the cumulative time as manual (operator controlled) time, machine time, and walk time. Looking at the combined data helps to identify the waste of excess motion and the waste of waiting.

Standard Work
Figure 2. Standard Work Combination Chart

Tip #3: Update the Procedures

A Standard Operation Procedure (SOP) is a set of instructions detailing the tasks or activities that need to take place each time the action is performed. Following the procedure ensures the task is done the same way each time. The SOP details activities so that a person new to the position will perform the task the same way as someone who has been on the job for a longer time.

When a process has changed, don’t just tell someone about the change: legitimize the change by updating the process documentation. Make sure to update any memory-jogger posters hanging on the walls, and the cheat sheets in people’s desk drawers, too. Including a document revision form such as Figure 3 in your control plan will ensure you capture a list of procedures that require updating. 

Document Revision
Figure 3. Document Revision Form

Tip #4: Feedback on New Behaviors Ensures Adoption

New processes involve new behaviors on the part of the workers. Without regular feedback and positive reinforcement, new process behaviors will fade away or revert to the older, more familiar ways of doing the work. Providing periodic feedback and positive reinforcement to those using the new process is a sure-fire way to keep employees doing things right. Unfortunately, it’s easy for managers to forget to provide this feedback. Using a Process Behavior Feedback Schedule like Figure 4 below increases the chance of success for both providing the feedback and maintaining the gains.

Process Behavior
Figure 4. Process Behavior Feedback Schedule

Tip #5: Display Metrics to Reinforce the Process Improvements

Metrics play an integral and critical role in process improvement efforts by providing signs of the effectiveness and the efficiency of the process improvement itself. Posting “before and after” metrics in the work area to highlight improvements can be very motivating to the team. Workers see their hard work paying off, as in Figure 5. It is important to keep the metric current, because it will be one of the first indicators if your process starts reverting. 

Before After Chart
Figure 5. Before and After Analysis

 

Kids Fishing

When it comes to fishing and actually catching fish, practice, effective feedback, and positive reinforcement make perfect.

The same goes for implementing process change. If you want to get past the learning curve quickly, use these tips and enjoy the benefits of an excellent process! 

To access these and other continuous improvement forms, download the 30-day free trial of Companion from the Minitab website at http://www.minitab.com/products/companion/
