
Fail Fast, Fail Early
Products should not fail; applications
should not crash. That is what we all want.
The reality, however, is that they sometimes
do. The goal is to fail early and to avoid
false failures (false positives and false
negatives), because the longer a failure
takes to identify, the longer it takes to
fix and the more it costs.
For example, correlating test failures with
test times can drive test-sequence changes
so that the tests most likely to fail run
earlier and faster. This reduces the time
wasted running through most of a test
sequence before discovering a faulty unit.
Furthermore, retest procedures can be
optimized to execute only the required
tests rather than re-running the full sequence.
Another consideration is to eliminate
tests that never fail or that are obsolete
due to product patches. Entire tests may be
skipped when anomalies are already detected
in data from upstream assembly processes.
Eliminating every source of process failure,
within the test process and holistically
across the production process, is key. In
today's complex business environment, where
things are constantly changing, speed of
execution must be balanced with optimal execution.
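
As a rough illustration of that reordering idea, the Python sketch below ranks test steps by historical failures caught per second of test time, so the cheapest, most failure-prone checks run first. The step names, failure rates, and durations are hypothetical placeholders for your own historical data, and any real ordering must still respect fixturing and sequencing constraints between steps.

# Sketch: reorder a test sequence so likely, fast-to-detect failures run first.
# Failure rates and durations below are hypothetical illustrative values.
from dataclasses import dataclass

@dataclass
class TestStep:
    name: str
    failure_rate: float   # fraction of historical runs that failed this step
    duration_s: float     # average execution time in seconds

def fail_fast_order(steps: list[TestStep]) -> list[TestStep]:
    """Rank steps by expected failures caught per second of test time."""
    return sorted(steps, key=lambda s: s.failure_rate / s.duration_s, reverse=True)

sequence = [
    TestStep("leakage_current", failure_rate=0.002, duration_s=1.5),
    TestStep("rf_output_power", failure_rate=0.030, duration_s=12.0),
    TestStep("dc_power_on", failure_rate=0.015, duration_s=0.8),
    TestStep("thermal_soak", failure_rate=0.001, duration_s=600.0),
]

for step in fail_fast_order(sequence):
    print(f"{step.name}: {step.failure_rate / step.duration_s:.4f} failures/s")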

Optimize Throughput
Automated test systems generate tons of
data from test sequences, parametric test
results, and complex measurements. Over
time, automated data analysis of trends
can deliver the insights you need to optimize your throughput and eliminate
bottlenecks. For manufacturers, production throughput can mean the difference
between meeting delivery schedules and
losing customers to the competition.
Don't confuse output with throughput.
Output is your total production, including
scrap, rejections, and stockpiled products,
while throughput only counts the parts
that are successfully delivered and accepted by the customer.
One of the most effective ways to improve
throughput in manufacturing is to carefully
analyze your test process for bottlenecks.
This includes, but is not limited to,
understanding test station and instrument
utilization to determine whether there are
opportunities to increase throughput by
parallelizing tests or redistributing existing assets.
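
A minimal sketch of that kind of utilization check, assuming you can pull per-run start and stop timestamps from your test logs; the station names and times below are made up.

# Sketch: estimate test-station utilization from start/stop timestamps to
# spot candidate bottlenecks. Records are hypothetical.
from datetime import datetime

# (station, test_start, test_end) records pulled from test logs
runs = [
    ("station_A", datetime(2020, 10, 1, 8, 0),  datetime(2020, 10, 1, 8, 20)),
    ("station_A", datetime(2020, 10, 1, 8, 25), datetime(2020, 10, 1, 8, 50)),
    ("station_B", datetime(2020, 10, 1, 8, 0),  datetime(2020, 10, 1, 8, 5)),
]

shift_hours = 8.0
busy: dict[str, float] = {}
for station, start, end in runs:
    busy[station] = busy.get(station, 0.0) + (end - start).total_seconds() / 3600

for station, hours in sorted(busy.items(), key=lambda kv: kv[1], reverse=True):
    utilization = hours / shift_hours
    flag = "  <- candidate bottleneck" if utilization > 0.85 else ""
    print(f"{station}: {utilization:.0%} utilized{flag}")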
Unplanned downtime due to improper hardware
configuration, incorrect software versions,
or out-of-date equipment calibration kills
manufacturing productivity and efficiency.
It is essential to have a system in place
to track overall system health and proactively
notify test engineers and technicians of any
imminent issues with test equipment so that
maintenance and repair can be planned properly.
Once identified, these bottlenecks can be
eliminated so they no longer cause throughput delays.
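
As one illustration of proactive health tracking, the sketch below flags stations whose test-software version has drifted from an approved baseline or whose calibration is coming due. The station records, version string, and warning threshold are hypothetical.

# Sketch: flag stations needing maintenance before they cause unplanned
# downtime. All records shown are hypothetical.
from datetime import date

APPROVED_SW = "2.4.1"

stations = [
    {"id": "ATE-01", "sw_version": "2.4.1", "cal_due": date(2020, 12, 15)},
    {"id": "ATE-02", "sw_version": "2.3.0", "cal_due": date(2020, 10, 12)},
]

def health_issues(station: dict, today: date, warn_days: int = 14) -> list[str]:
    issues = []
    if station["sw_version"] != APPROVED_SW:
        issues.append(f"software {station['sw_version']} != approved {APPROVED_SW}")
    if (station["cal_due"] - today).days <= warn_days:
        issues.append(f"calibration due {station['cal_due']}")
    return issues

for s in stations:
    for issue in health_issues(s, today=date(2020, 10, 5)):
        print(f"{s['id']}: {issue}")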

Smarter Testing Through
Data Analytics
The foundation of failing fast and optimizing
throughput is data. But it goes beyond
test-result data. Every process generates
data, from test station health and operator
indicators to test status and performance
results, that can be used to your advantage
to identify the patterns in your process
that introduce inefficiencies. As an example,
tracking asset utilization and availability
data in real time can optimize capital
expenditures while making your organization
more agile and responsive to business needs.
If you take it one step further and start
using historical data, you can identify
patterns and determine where process variation
occurs. This can lead to insights that tell
you where you have the most unplanned downtime
and what the trends leading up to it look like,
whether additional operator training is needed,
and how your capacity has changed over time
and where it is headed. To truly excel at
using your data, you must bring together
design engineers, test engineers, and data
scientists to identify and implement the
analytics or machine-learning algorithms that
enable data-driven decision making and smarter
testing. This is an iterative process as you
determine the impact on key business metrics.
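
For example, a simple aggregation of historical downtime events by cause and by month, as sketched below with hypothetical maintenance-log records, already shows where most unplanned downtime occurs and how it is trending.

# Sketch: aggregate downtime events by cause and by month. Records are
# hypothetical stand-ins for maintenance-log data.
from collections import defaultdict

# (month, cause, downtime_hours) extracted from maintenance logs
events = [
    ("2020-07", "calibration_expired", 6.0),
    ("2020-08", "fixture_wear",        4.5),
    ("2020-08", "calibration_expired", 9.0),
    ("2020-09", "calibration_expired", 14.0),
    ("2020-09", "software_mismatch",   2.0),
]

by_cause: dict[str, float] = defaultdict(float)
by_month_cause: dict[tuple[str, str], float] = defaultdict(float)
for month, cause, hours in events:
    by_cause[cause] += hours
    by_month_cause[(month, cause)] += hours

print("Total downtime by cause:")
for cause, hours in sorted(by_cause.items(), key=lambda kv: kv[1], reverse=True):
    print(f"  {cause}: {hours:.1f} h")

print("Monthly trend for the top cause:")
top = max(by_cause, key=by_cause.get)
for (month, cause), hours in sorted(by_month_cause.items()):
    if cause == top:
        print(f"  {month}: {hours:.1f} h")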

Getting Started
As market dynamics increase the pressure
on manufacturing to deliver new products
faster at lower cost, the status quo is no
longer a viable option. Smarter testing
through data analytics can provide a
competitive advantage. But how do you start?
As with any big initiative, you need to start
small, because boiling the ocean never works.
But the right proof of concept can provide a
key transformative example for your organization.
First, identify a few key business metrics
to improve by quantifying the costs and
inefficiencies of current processes.
Smarter testing can obviously increase
throughput and yield, but it can also reduce
scrap costs, decrease lead times and
inventory, and improve gross margins.
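
A back-of-the-envelope baseline for a few of those metrics might look like the following sketch; the unit counts and costs are hypothetical placeholders for your own production data.

# Sketch: quantify a baseline for first-pass yield, scrap cost, and delivered
# throughput from unit-level records. All figures are hypothetical.
units_started   = 10_000
units_passed    = 9_300      # passed final test
units_scrapped  = 250        # failed and could not be reworked
units_delivered = 9_150      # accepted by the customer
unit_cost       = 42.00      # material + labor cost per unit, USD

first_pass_yield = units_passed / units_started
scrap_cost       = units_scrapped * unit_cost
throughput_rate  = units_delivered / units_started   # delivered vs. started

print(f"First-pass yield: {first_pass_yield:.1%}")
print(f"Scrap cost:       ${scrap_cost:,.2f}")
print(f"Throughput:       {throughput_rate:.1%} of started units delivered")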
Next, assess the current state of your data
in terms of quality, frequency, and accessibility.
Many organizations have highly disparate data
sources and almost no visibility into test
equipment compliance and utilization. Data
analytics is only as good as the quality of
the underlying data, so it is critical to deploy
test operations management software to
centrally manage test assets, enforce data
standardization, and create a robust data
pipeline. The key is to move from inflexible
platforms and workflows to "permanently
agile" ones.

Operations Management Software
NI SystemLink is an example of test operations
management software that connects your test
environment and puts test and measurement
data to work uncovering actionable insights.
SystemLink provides the visibility you need
to remove bottlenecks across the test workflow,
from pre-test coordination and preparation,
through automated test execution, to post-test
analysis and proactive action.
The growing trends of big data and machine
learning have already disrupted consumer
markets. Amazon and Netflix displaced
traditional brick-and-mortar companies through
the internet, but it is their use of big-data
analytics to serve customers faster and smarter
that creates a significant competitive advantage.
Test organizations have the same opportunity
to leverage the untapped potential of test data
and evolve from a necessary cost center into a
significant competitive advantage. The question
you need to ask yourself is: will you lead this
revolution or become a casualty of it?
