Hi Everyone,
I joined this forum looking for some insight into some issues we've been having at the printing company where I work as a graphic designer/IT manager.
We have a client who prints roughly 50-60k VDP postcards each week. The workflow is as follows:
1. Laid out in InDesign using XMPie uDirect Standard 4.6, linked to a CSV file.
2. uDirect generates a "data" file (variable data fields only) and a "master" file (backgrounds only).
3. The master is sent to the DocuSP software on the front end of an iGen3.
4. The data file is processed in chunks of ~3k records, 9-up, and sent to DocuSP as PDFs.
5. Master and data are merged on the DocuSP, then printed.
We now have a problem. Instead of printing straight text, the client is now using a design that applies heavy effects to the variable text fields. When the PDFs are merged in the DocuSP software, we get huge white knockout boxes on top of the background image.
I'm very familiar with the knockout/transparency issue and have resolved it in the past by changing the Acrobat compatibility level or by removing spot colors. There are no spot colors in this job, and switching from Acrobat 5 to Acrobat 6 compatibility did not help.
XMPie support tells me it's the DocuSP that isn't merging the data correctly. They recommended several workarounds that just won't work: a) creating PostScript files - files too large; b) merging data and master in XMPie - files too large, and processing on both XMPie and the DocuSP takes too long; c) PPML - again, files too large. When I say "too large", I'm talking about 20 records producing a 182 MB file, and again, we're used to doing 3k records at once.
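To put a number on "too large": extrapolating linearly from that 20-record sample (a rough sketch, assuming file size scales with record count, which won't be exact since shared resources get reused):

```python
# Back-of-the-envelope extrapolation of merged-file size,
# assuming size scales roughly linearly with record count.
SAMPLE_RECORDS = 20
SAMPLE_SIZE_MB = 182
BATCH_RECORDS = 3000  # our usual chunk size

mb_per_record = SAMPLE_SIZE_MB / SAMPLE_RECORDS       # ~9.1 MB per record
batch_size_gb = mb_per_record * BATCH_RECORDS / 1024  # ~26.7 GB per 3k-record batch

print(f"{mb_per_record:.1f} MB/record -> {batch_size_gb:.1f} GB per {BATCH_RECORDS}-record batch")
```

At roughly 9 MB per record, a 3k-record chunk would be on the order of 27 GB, which is clearly unworkable to move and RIP.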
I understand that the effects are the bottleneck on the processing machine, but the DocuSP is not combining the files properly either.
I manage the entire front end with InDesign and XMPie, but a press operator manages the DocuSP system, of which I have no experience or knowledge (other than him telling us to "flatten" files or to re-send them because they are not processing). The processing machine is a Mac with two dual-core 2.6 GHz Intel Xeon processors. By way of comparison, our regular text-only records process at roughly 5 per second, while these new files (processing only the data templates) take roughly 2 seconds per record. We know that a hardware upgrade on the front end may be inevitable, but I figured I'd ask here for ideas from people who have experience with XMPie, or with alternative plug-ins that may run faster or produce files the DocuSP software can handle more easily. Any help or advice is welcome! Thanks!
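P.S. For the record, here's the arithmetic on that slowdown per 3k-record batch (a rough sketch using the rates quoted above):

```python
# Rough comparison of per-batch processing time on the front-end Mac,
# text-only vs. effects-heavy, using the observed throughput figures.
BATCH = 3000                     # records per chunk
TEXT_ONLY_RATE = 5               # records/second on plain-text jobs
EFFECTS_SECONDS_PER_RECORD = 2   # seconds/record on the new effects-heavy job

text_minutes = BATCH / TEXT_ONLY_RATE / 60                 # 10 minutes
effects_minutes = BATCH * EFFECTS_SECONDS_PER_RECORD / 60  # 100 minutes
slowdown = effects_minutes / text_minutes                  # 10x

print(f"text-only: {text_minutes:.0f} min; effects: {effects_minutes:.0f} min "
      f"({slowdown:.0f}x slower)")
```

So each 3k chunk goes from about 10 minutes to about 100 minutes of processing - a 10x slowdown - before the DocuSP merge problems even come into play.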