One of the many joys of being a part of the Product of the Year competition is the effort to make the judging for all categories as objective as possible – and nowhere is this more evident than in the output device categories. A quick step back to make sure that we are all on the same page: Product of the Year has been running at the former SGIA Expo (now PRINTING United) for a number of years – at least since 2009, and maybe even a bit before then. The competition has two main categories – output device and non-output device. Output device is exactly what you think it is – a way to have printing devices compete in sub-categories (think UV Flatbed and Solvent/Latex roll-to-roll) against each other for the bragging rights of Product of the Year in each of those sub-categories.
This is a hard-fought competition, and when I joined SGIA in 2014 the printers in the output device sub-categories were judged on the basis of subjective evaluations such as color fidelity, gray balance, and print quality. No metrics were used; it was the expert eyes of judges looking at each print and scoring it to the best of their ability. And I should note that the prints were judged the day before the trade show began, in a not-so-well-lit hall as set-up was underway. Now I’m not saying that this was a bad way to judge a print, but it certainly was not an ideal one. The hope was to bring in objective metrics based on industry specifications to guide the selection process.
At first, the 84-patch control strip from the ISO 12647-7 Control Wedge 2013 was used to give feedback on neutral gray, solid ink values, and RGB overprints, and then 8 spot colors were evaluated one at a time to give an average Delta E (the difference in color from the original). All in all, this was a big improvement to the process, and the best prints (as measured against these metrics) lined up with the prints that the judges scored highest in almost every case. So a broken process was made better, but it still lacked much in the way of automation – each spot color had to be measured and recorded one at a time, which was quite tedious.
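To show what that spot color average looks like in practice, here is a minimal sketch that averages CIE76 Delta E values for a handful of spot colors. The Lab numbers are placeholders, not the actual competition references, and the competition may well use a different Delta E formula.

```python
# Minimal sketch: averaging Delta E (CIE76) across a set of spot colors.
# The Lab values below are placeholders, not the actual competition references.
import math

def delta_e76(lab1, lab2):
    """Euclidean distance between two CIELAB values (CIE76 Delta E)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# (reference Lab, measured Lab) pairs for each spot color -- placeholder data
spot_pairs = [
    ((47.0, 68.0, 48.0), (48.5, 66.2, 45.9)),
    ((55.0, -37.0, -50.0), (53.8, -35.5, -47.2)),
]

avg_de = sum(delta_e76(ref, meas) for ref, meas in spot_pairs) / len(spot_pairs)
print(f"Average Delta E across spot colors: {avg_de:.2f}")
```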
Fast forward a couple of years: while working with my friend and colleague Dan Gillespie (now with Alder Color Solutions and an instructor in the SGIA Color Management Boot Camps), a new test target and set of metrics was designed. The patch set is made up of 1,717 color patches – 1,617 CMYK patches (TC1617), plus 84 RGB patches (AdobeRGB), 8 solid spot colors, and 8 50% tints of those spot colors. This first-of-its-kind chart generates a report card (using Chromix Maxwell cloud software) giving a full performance report for each printer.
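For reference, the patch counts above add up as follows; the little tally below just restates the composition described in the text as a data structure.

```python
# Composition of the 1,717-patch target, as described above.
patch_set = {
    "CMYK (TC1617)": 1617,
    "RGB (AdobeRGB)": 84,
    "spot color solids": 8,
    "spot color 50% tints": 8,
}

total = sum(patch_set.values())
assert total == 1717
print(f"Total patches: {total}")
```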
Some of the metrics that are reported include the percentage of AdobeRGB that a device can achieve, the average Delta E of the spot colors, the average Delta E of the spot color tints at 50% (using SCTVi), the solid ink values, and the RGB overprint values. Using ISO PAS 15339-1 and -2, we can see how tight the tolerance is with NPDC (neutral print density curves), gray balance, and the average Delta E of the CMYK patches right on the report card.
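To make the shape of that report card concrete, here is a minimal sketch of how those metrics might be collected into a single record. The field names and sample values are purely illustrative – this is not the actual Maxwell report format.

```python
# Hypothetical report-card record -- field names and values are illustrative only,
# not the actual Chromix Maxwell output.
from dataclasses import dataclass

@dataclass
class ReportCard:
    adobe_rgb_coverage_pct: float   # % of AdobeRGB gamut the device achieves
    avg_de_spot_solids: float       # average Delta E of the 8 spot color solids
    avg_de_spot_tints_50: float     # average Delta E of the 8 50% tints
    avg_de_cmyk_patches: float      # average Delta E of the 1,617 CMYK patches
    substrate_white_point: tuple    # Lab of the unprinted substrate

card = ReportCard(
    adobe_rgb_coverage_pct=88.4,
    avg_de_spot_solids=2.1,
    avg_de_spot_tints_50=2.8,
    avg_de_cmyk_patches=1.9,
    substrate_white_point=(95.0, 1.2, -3.5),
)
print(card)
```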
The report card also includes a number of charts and graphs so that one can see at a glance the gamut size of a device and the neutral print density curves in both CMY and K, along with a CRF (cumulative relative frequency) graph that lets you easily see how well a device is printing. The report card is comprehensive, and since each category is printing on the same type of material (the white point of a substrate is a big factor in print quality, and you can see the white point value in the report), the devices are compared on an apples-to-apples playing field. The best part was partnering with the Sonoco Institute at Clemson University as our third-party evaluator. Using an X-Rite i1iO table automated the measurement, and having an independent group doing the work made the process much more manageable.
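As an illustration of what a CRF graph summarizes, here is a minimal sketch that turns a list of patch Delta E values into cumulative-relative-frequency points. The Delta E list is placeholder data, not real measurements.

```python
# Minimal sketch: cumulative relative frequency (CRF) of patch Delta E values.
# The Delta E list is placeholder data, not real measurements.
delta_es = [0.4, 0.7, 0.9, 1.1, 1.3, 1.6, 2.0, 2.4, 3.1, 4.2]

sorted_de = sorted(delta_es)
n = len(sorted_de)

# For each measured Delta E, the percentage of patches at or below that value.
crf = [(de, 100.0 * (i + 1) / n) for i, de in enumerate(sorted_de)]

for de, pct in crf:
    print(f"{pct:5.1f}% of patches are at or below Delta E {de:.1f}")
```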
Of course, the entire print is not made up of test charts. Photos are selected each year, some highlighting local interest or flavor and others chosen for their challenging saturated colors. This year the Atlanta skyline is featured, along with a Savannah streetscape and Starrs Mill. Fine type and lines are also part of the test print because, at the end of the day, print is evaluated by people, not a color measurement device. Judges still score the prints – looking for banding, image quality, color gamut (compared to a control print), and color and gray balance. The judges' scores form the base score, and the report card data on color differences is used for deductions from that base. Again, as in the past, the objective scoring has lined up very closely with the judges' evaluations.
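Here is a minimal sketch of how that scoring logic might combine the two inputs – the judges' base score and report-card deductions. The weights and allowance are invented for illustration and are not the actual competition formula.

```python
# Hypothetical scoring sketch: judges' base score minus deductions driven by
# report-card color-difference data. Weights and thresholds are illustrative only.
def final_score(judges_base, avg_de_cmyk, avg_de_spots,
                de_weight=2.0, de_allowance=1.0):
    """Deduct points for average Delta E above an allowance, per metric."""
    deductions = 0.0
    for avg_de in (avg_de_cmyk, avg_de_spots):
        deductions += max(0.0, avg_de - de_allowance) * de_weight
    return judges_base - deductions

print(final_score(judges_base=90.0, avg_de_cmyk=1.8, avg_de_spots=2.5))  # 85.4
```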
Now that you have seen a glimpse behind the scenes, I hope you’ll watch for the Product of the Year winners in this year’s virtual gallery and check out their printers at PRINTING United in Atlanta. Stay safe and I hope to see you in October.
Download the Free SGIA Product of the Year test file here.
Ray assists association members with information on digital printing as well as digital equipment, materials, and vendor referrals. He oversees training and certification workshops at PRINTING United Alliance. Ray is project manager for both the PDAA Certification program and the PRINTING United Alliance Digital Color Professional Certification program and is an instructor for the Color Management Boot Camps as well as a G7 expert. Ray regularly contributes to the Association's Journal and won the 2016 Swormstedt Award for Best in Class writing in the Digital Printing category. Ray was inducted into the Academy of Screen and Digital Printing Technologies (ASDPT) in 2020. He also works with SkillsUSA to conduct the National Competition for Graphics Imaging Sublimation. Outside of work, Ray enjoys biking, international cuisine and spending time with his three fantastic grandkids.