GradientOne
  • Home
  • Solutions
    • Overview
    • Test Engineering
    • Compliance Labs
    • Product Features
    • Supported Instruments
  • Documentation
    • White Papers
    • Getting Started
    • Quality Analysis of Test Data
    • Rigol Automation
    • Waveform Upload
    • Visualization & Measurements
    • News
    • Case Study
  • Try For Free
  • Pricing
    • Buy Now
    • Pilot
  • RIGOL
  • Blog

Blog

Instrument Discovery

2/23/2018

0 Comments

 
A common challenge I see in many lab environments is limited visibility into what test equipment actually exists in the lab. In the best case, someone maintains a spreadsheet with each instrument's model, vendor, serial number, calibration date, and so on. But let's face it: keeping an Excel file current on which equipment is in use, which is gathering dust, which is rented, and which is on loan to a partner is hard to keep up with, and it will drift out of date over time.
What is the best way to track these assets?
GradientOne developed a feature to automate discovery and utilization of test equipment (you can read about our utilization feature here).   Customers simply install our Discovery agent on the same network as their lab.  Our Discovery agent subsequently monitors traffic and when it senses a new piece of test equipment on the network, it characterizes it, uploads information to the customer's GradientOne web portal, and allows the user to register the device for tracking and utilization.
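GradientOne hasn't published the agent's internals, but as a rough illustration of what "characterizing" a device involves: most networked test equipment answers the standard SCPI `*IDN?` query with a comma-separated identity string (vendor, model, serial, firmware), which is enough to build a tracking record. A minimal sketch; the function name and example reply are illustrative, not GradientOne's API:

```python
def parse_idn(response: str) -> dict:
    """Parse a SCPI *IDN? reply of the form 'vendor,model,serial,firmware'."""
    fields = [f.strip() for f in response.split(",")]
    # Pad so a malformed reply still yields a four-field record
    vendor, model, serial, firmware = (fields + [""] * 4)[:4]
    return {"vendor": vendor, "model": model,
            "serial": serial, "firmware": firmware}

# Example reply from a Rigol oscilloscope (serial number is made up)
record = parse_idn("RIGOL TECHNOLOGIES,DS1054Z,DS1ZA000000000,00.04.04")
```

A record like this is what would be uploaded to the web portal for the user to register.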

Take a look at the video below, which shows our Discovery feature in action.

TEST RIG Utilization For The Lab AND Manufacturing Floor

2/11/2018

3 Comments

 

View Utilization By Lab Location, Type, Product

Schedule Use Of Test Systems To Optimize Ops

Use Trend Information To Plan New Purchases


Optimal utilization of test systems drives more cost-effective operations in the test lab and improves visibility for budgeting and future capital expenditures. The challenge is that many lab environments can't easily collect data, track usage, and take the steps needed to balance usage with data-driven tools. GradientOne's Test Rig Tracker & Scheduler is designed to help customers make decisions that improve testing throughput while managing spend on test infrastructure.
Our approach to building this solution is guided by three basic principles:
  1. Simple deployment
  2. Do not disrupt the Test Engineer’s existing workflow
  3. Kill multiple birds with one stone: help both the engineering and finance teams
This blog post provides background on how it works, how to use it, and the various problems it can solve for your test infrastructure.

How It Works
Getting started is simple. An engineer logs into the GradientOne web interface (Figure 1) and registers a Test Rig with the following information:
  • Test Rig Name
  • Test Rig Type
  • Location
  • Lab
  • Description
All of this information is indexed and made available for tracking, reporting, and trend analysis.
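The exact registration payload isn't documented here, but the fields above map naturally onto a small record that can be serialized for indexing; a sketch with illustrative field names and values:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TestRig:
    name: str
    rig_type: str
    location: str
    lab: str
    description: str

rig = TestRig(name="emc-rig-07", rig_type="oscilloscope",
              location="Building 2", lab="EMC Lab",
              description="DS1054Z with programmable DC load")
payload = json.dumps(asdict(rig))  # the record that gets indexed for tracking
```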

Figure 1
After a Test Rig is registered, the GradientOne Agent is provisioned and installed on it. The Agent operates in the background: it activates on Test Rig bootup, reports the rig's usage to the GradientOne cloud platform, and integrates with the scheduling system, with no involvement required from the Test Engineer.
Figure 2

The Summary page (Figure 2) provides a global, comprehensive view of all Test Asset utilization information. Filter and sort to customize views (Figure 3) based on test rig type, location, product line usage, and more.
Figure 3
View Test System utilization trends to plan future purchases. Figure 4 shows increasing weekly utilization of a test system, alerting Engineering management to the need to procure a new system and aligning capital expense with business demand.
Figure 4
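The trend behind Figure 4 can be approximated numerically: fit a least-squares slope to weekly utilization percentages and flag a procurement review when utilization is both rising and near capacity. A sketch; the thresholds here are illustrative, not GradientOne's:

```python
def utilization_slope(weekly_pct):
    """Least-squares slope of a utilization series, in percentage points per week."""
    n = len(weekly_pct)
    mean_x = (n - 1) / 2
    mean_y = sum(weekly_pct) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(weekly_pct))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

weeks = [52, 58, 63, 70, 74, 81, 86, 90]      # weekly % utilization, trending up
rising = utilization_slope(weeks) > 3          # sustained upward trend
needs_new_rig = rising and weeks[-1] > 85      # and close to capacity
```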
Figure 5
Scheduling is integrated so teams can book Test Rig time. Test Engineers can check test rigs in and out, and Test Labs can book Test Rig time and allocate it to specific customer engagements.
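The scheduling internals are GradientOne's, but the core check any booking system makes is interval overlap; a minimal sketch with hypothetical helper names:

```python
def overlaps(start_a, end_a, start_b, end_b):
    """Two half-open intervals conflict iff each starts before the other ends."""
    return start_a < end_b and start_b < end_a

def can_book(existing, start, end):
    """existing: list of (start, end) bookings already held on one test rig."""
    return not any(overlaps(s, e, start, end) for s, e in existing)

bookings = [(9, 11), (13, 15)]       # today's bookings, in hours
ok = can_book(bookings, 11, 13)      # fits exactly between the two slots
```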

PCA with Recipes

2/7/2018

0 Comments

 
I recently bought a berry crumble from Wal-Mart that didn't live up to my expectations: it had way too much sugar and no oatmeal or cake batter, so the entire crumble went into the compost after a few bites. Was it that I had confused crumble with cobbler? To avoid wasting another 5 dollars in the future, I did what any data scientist would do: run Principal Component Analysis on text-mined recipes from the internet. I formatted the data into a table whose first column records whether "crumble" or "cobbler" appears in the title, and whose remaining columns contain 1 or 0 depending on whether the column header's word appears in the recipe's text. I uploaded and ran PCA as described in a previous post. The scatterplot of component 1 vs component 2 looks like:
[Scatterplot: crumble vs cobbler recipes, component 1 vs component 2]
From the scatterplot, there's no clear distinction between the two, except for the cluster of cobblers on the lower right-hand side. The first component separates a few cobblers from the rest of the recipes using the equation:

pca1 = drink*0.120 +  alcoholic*0.103 + quail*0.081 + christmas*0.072 + liqueur*0.063 

Turns out there is a type of cocktail called a "cobbler", and that is what this first component is successfully separating out of the recipe set. The next component is: 

pca2 = bake*0.353 + fruit*0.333 + gourmet*0.307 + dessert*0.217

This component captures that cobblers are slightly more likely than crumbles to be baked, and are more likely to contain fruit (as opposed to vegetables). However, unlike the alcoholic cobblers, there's no clear delineation between the two. But now that I've written the script for grabbing recipes and creating these tables, why not ask another question I've wondered about: what is the difference between a lunch food and a breakfast food? Applying the same process, the scatterplot looks like:
[Scatterplot: breakfast vs lunch recipes, component 1 vs component 2]
It's possible to separate a lot of the lunch recipes from the breakfasts, but almost all the breakfasts overlap with the lunch space and so could be considered lunches. The first component is:

pca1 = oyster*0.390 + shrimp*0.365 + low cholesterol*0.240 + prune*0.208+ celery*0.202 + dried fruit*0.202 + parsley*0.149 + seafood*0.113 

So seafood and low-cholesterol are indicators of a recipe being a lunch food and not a breakfast food. This seems right; eggs are not a low-cholesterol food, and there are few breakfast foods that involve seafood. Unlike in the crumble vs cobbler example, this first component explains far more of the variance than the next components. 
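Concretely, a component score is just the dot product of a recipe's 0/1 indicator columns with the component's weights; scoring one recipe against the pca1 weights above looks like this (a sketch):

```python
pca1_weights = {"oyster": 0.390, "shrimp": 0.365, "low cholesterol": 0.240,
                "prune": 0.208, "celery": 0.202, "dried fruit": 0.202,
                "parsley": 0.149, "seafood": 0.113}

def component_score(recipe_terms, weights):
    """Sum the weights of the component terms present in the recipe."""
    return sum(w for term, w in weights.items() if term in recipe_terms)

# A shrimp-and-celery dish scores high on pca1, i.e. toward "lunch"
score = component_score({"shrimp", "celery", "seafood"}, pca1_weights)
```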

PCA can answer similar questions across many domains. Instead of identifying the groups of ingredients that define different types of food, you might identify the common words in human-written feedback and failure descriptions. Or you might explore the space of descriptions of other products to find new potential products, or the products customers would find most familiar. For example, we might use our knowledge that breakfasts don't involve seafood to open a seafood restaurant that opens in the morning, taking advantage of an untapped market.
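The same indicator-table construction carries over directly to those domains; here is a sketch of building one row from a piece of free text (the vocabulary is a toy example, and real tokenization would also strip punctuation):

```python
vocab = ["timeout", "overload", "calibration", "firmware"]

def indicator_row(text: str) -> list:
    """One row of the PCA input table: a 0/1 indicator per vocabulary word."""
    words = set(text.lower().split())
    return [int(w in words) for w in vocab]

row = indicator_row("unit failed after firmware update, calibration drifted")
# Stack a row like this for every document, then run PCA on the resulting matrix
```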



OUR MISSION
GradientOne's mission is to improve the work of engineers, manufacturing organizations, technical support teams, and scientists, by using cloud computing to streamline instrument management, data collection, analysis, reporting, and search.
©2019 GradientOne Inc. All Rights Reserved.