Resolution

While most CTPs are 2400 dpi or 2540 dpi, does anyone know the origin of the various resolutions that CTP and imaging devices have evolved from?

2400 is a multiple of 300

2540 dpi is the metric equivalent of 1000 dots per cm



Why are 2438.4 and 1219.2 used?

Tim
 
I am trying to determine the origin of the 2438.4 resolution standard.

2400 ties to 300 dpi because it is a multiple of 300

2540 is a logical metric conversion from 1000

2438.4 does not tie to a standard English unit of measure or to a metric one. I have looked at the
various point size standards and none of those seem to line up. I do know that HP Indigo
presses use 812.8 dpi, which is a factor of 2438.4. The 2438.4 resolution has been used
as a resolution on output devices going back to the early film days. 2400 and 2540 seem to be the most prevalent within the industry.
 

I haven't been able to find any info in my library - so the following is speculation.

Remember that technically "dpi" is actually a measure of addressability rather than "resolution"

2400 dpi was probably chosen because it is the lowest resolution at which the imaging mark was below the resolution threshold of the human eye, i.e. at normal viewing distances you can't see a 10.6 micron dot (a 10 micron dot for a 2540 dpi device). So using a higher dpi would have been a waste of processing and imaging time.
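
Just to show the arithmetic behind those spot sizes: a single addressable mark is simply 25,400 microns (one inch) divided by the dpi. A quick Python check, nothing device-specific:

# Addressable spot size in microns = 25,400 microns per inch / dpi
for dpi in (2400, 2438.4, 2540):
    print(f"{dpi} dpi -> {25400 / dpi:.2f} micron spot")
# 2400 dpi -> 10.58 micron spot
# 2438.4 dpi -> 10.42 micron spot
# 2540 dpi -> 10.00 micron spot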

2438.4 dpi may have come from the imaging mark size of some early film/paper imagesetters, especially those that were scanners used to output film. I've only seen it referenced in PPDs like this one for the Fuji Celix 4000:
*Resolution 1219dpi/1219.2dpi: "1 dict dup /HWResolution [1219.2 1219.2] put setpagedevice"
*Resolution 1828dpi/1828.8dpi: "1 dict dup /HWResolution [1828.8 1828.8] put setpagedevice"
*Resolution 2438dpi/2438.4dpi: "1 dict dup /HWResolution [2438.4 2438.4] put setpagedevice"
*Resolution 3657dpi/3657.6dpi: "1 dict dup /HWResolution [3657.6 3657.6] put setpagedevice"
*Resolution 4876dpi/4876.8dpi: "1 dict dup /HWResolution [4876.8 4876.8] put setpagedevice"
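
One thing that stands out in that list (my observation, not anything stated in the PPD itself): every one of those resolutions is an exact multiple of 304.8, i.e. 120 per cm converted to inches. A quick check:

# Each Celix resolution is a whole multiple of 304.8 (= 120 per cm x 2.54 cm per inch)
for dpi in (1219.2, 1828.8, 2438.4, 3657.6, 4876.8):
    print(f"{dpi} = {dpi / 304.8:g} x 304.8")
# 1219.2 = 4 x 304.8, 1828.8 = 6 x 304.8, 2438.4 = 8 x 304.8,
# 3657.6 = 12 x 304.8, 4876.8 = 16 x 304.8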

gordo
 
Gordo, thanks for the response. I am guilty at times of using dpi and ppi as though they are the same.

I did finally figure it out. As a scanner operator from decades ago I should have made the connection faster.

A 300 PPI image is/was the industry standard for 150 LPI halftones, and 150 LPI halftones were commonly output at 2400 dpi to achieve the proper number of gray levels per halftone dot.
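
The rule of thumb behind that pairing is that the gray levels available per halftone dot are (output dpi / screen lpi)^2 + 1, so a 2400 dpi recorder at 150 lpi gives a 16 x 16 cell and 257 levels, a full 8-bit tone scale. A small sketch (the 152.4 lpi value is just a 60 L/cm screen converted to inches):

# Gray levels per halftone dot = (output dpi / screen ruling lpi)^2 + 1
def gray_levels(dpi, lpi):
    cell = dpi / lpi              # recorder spots along one side of the halftone cell
    return cell * cell + 1

print(round(gray_levels(2400, 150)))        # 257 -> 16 x 16 cell, full 8-bit tone scale
print(round(gray_levels(2438.4, 152.4)))    # 257 -> same cell size for a 60 L/cm screen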

Most drum scanners were metric, so instead of scanning for a true 150 lpi they digitized files at 60 LCM (lines per cm), i.e. 120 PPCM (pixels per cm).
NOTE: Halftone resolution was commonly half of the digital image resolution; in other words, a digital image had 2 pixels per halftone line.

A 60 LCM halftone was actually digitized at 120 PPCM

120 PPCM x 2.54 = 304.8 PPI

The difference between 2438.4 and 2400 is due to the rounding of the metric halftone resolutions.

304.8 / 300 = 1.016 (2400 x 1.016 = 2438.4)
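
Putting that chain of numbers in one place (straight from the figures above, nothing new beyond the unit conversions):

# Reconstructing 2438.4 dpi from the metric scanning workflow described above
ppcm  = 120                     # scan resolution in pixels per cm (for a 60 L/cm screen)
ppi   = ppcm * 2.54             # 304.8 ppi, the metric counterpart of 300 ppi
scale = ppi / 300               # 1.016, the metric-vs-inch rounding difference
print(round(ppi, 1))            # 304.8
print(round(2400 * scale, 1))   # 2438.4 -> the "odd" recorder resolution
print(round(8 * ppi, 1))        # 2438.4 again: exactly 8 recorder spots per scan pixel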
 
