
Slow performance on 50+ page books

ptc-1929204
1-Newbie


I'm using a new Styler stylesheet and the Publishing Engine to generate parts books, and it works perfectly… until we try to run one over 50 or so pages. The server will run for 20 minutes and then abort the job. I have watched the diagnostics and performance on the server while a job is running: the CPU is only running at about 20% and the memory isn't even being touched.

I have done the following on the server, and nothing has made a difference so far:
1. Increased the Windows server virtual memory from 2046 MB to 11902 MB (the recommended amount)
2. Increased the Arbortext heap size from 759 to 3000

Would anyone have any guesses as to how to configure the server to handle a 130-page book? The box has 4 processors and 7 GB of RAM! One would think it should be able to rip through these books in no time…

18 REPLIES

Just an idea: is there something different in those additional pages? A different format for the graphics? Network location differences? More complex tables (less likely, given your comments on resources)?

..dan

Have you checked the timeout? It's possible it's quitting at the appointed
give-up time. Does anything run longer than 20 minutes successfully?

There is nothing different about the books besides the number of pages.

I suppose it could be timing out at 20 minutes, but I was thinking it should be completing well before this time limit. Would anyone know about how long a (relatively simple) 130 page book should take to complete?

Thanks

md

Did you use many XPath expressions in your Styler stylesheet? XPath expressions can contribute to processing time.



-Jean K.


I'm not remembering the details, but there are certain things that cause geometrically increasing processing times. XPath, like Jean suggests, may be one of them. ACL calling system functions rings a bell.

With XPath in particular, it depends a lot on the specifics. It's easy to inadvertently write XPath expressions that take exponential time (with respect to document size) to evaluate, but if you're careful you can use XPath pretty efficiently. A very general rule: the more specific you can be about the starting node, the better. Avoid "//" as much as you can, especially at the beginning of your expressions.
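
For illustration (hypothetical element names; adjust them to your own document type), compare two ways of selecting the same table rows:

//row
/book/chapter/table/tgroup/tbody/row

The first makes the processor walk every node in the document each time the expression is evaluated; the second touches only nodes along the named path, so its cost stays roughly proportional to the number of rows. Nested descendant steps ("//tgroup//entry") compound the problem.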



There are several places online that have guidelines for optimizing XPath
expressions (usually in the context of XSLT stylesheets). Google is your
friend here.



--Clay



Clay Helberg

Senior Consultant

TerraXML


I suppose I could look at the XSL, but someone else is creating it with Styler (and that's a mess to look at). I do have a feeling that it has something to do with how Styler generated the XSL.
byork
17-Peridot
(To:ptc-1929204)

I think we have narrowed the culprit down to tables. If we publish a book with 56 pages, it takes about 6 minutes, and all but three or four pages have tables. If I remove the information from the tables and leave just a couple of rows, I get the book in about 3 minutes. Does anybody know if using the CALS table model causes that big of a performance hit, and whether there is something different we should be doing?

Brian,

Tables have always taken longer to format than flowing text in Arbortext Editor.

If your documents require more than one formatting pass, and you are not already using formatting pass reduction, search the Help for "Improving formatting speed".

Good luck!
Suzanne Napoleon
www.FOSIexpert.com
"WYSIWYG is last-century technology!"


Try eliminating any table footers you might have, if only to see if there is an improvement in performance. tfoots are think-ahead-but-remember-where-you've-been elements. Ouch.

I've said it before and I'll say it again: "Tables are the spawn of
Satan!"


Hi Brian, All,

We ran into a problem as well with large tables: try 600 pages of 10-column parts lists and wire lists. I once got some good advice from the group; I believe Lynn Hales and others suggested entering the following on the command line, so our sequence of commands looks like this:


# Load our publication's ACL support file
source "ourpublicationsaclfile.acl"

# Open the document without displaying it in an Editor window
$d=doc_open("yourdocumentname.sgm")

# Point the hidden document at the FOSI stylesheet
current_doc($d);set fosi=yourfosiname.fos

# Run all formatting passes, wait for completion, then print
current_doc($d);format allpasses force wait;print panel


I believe what happens is that you are suspending the rendering of the display of "yourdocumentname.sgm" and any print preview. In the case of our very big pub, which was essentially just one long table, it used to take 12 hours or more, and I'd hope it would be ready to print when I came back the next morning. Using this "$d=doc_open" method, it would resolve in an hour or so and be ready to print. Note: when using the command line, it appears nothing is happening after the last command, but it is working, so have patience. If you hit Enter, you'll get a message telling you to wait or something along those lines...

A few contributing factors:

* Our documents are in SGML, hence the *.sgm filenames.
* We use Print Composer.
* We use Epic 4.3 for our production environment at the moment (I've got 5.3 going but need to compile DTDs, etc., sometime soon no doubt).
* I'm not sure if having a flat file (if you use entities) would help reduce the time. We often use the command line "entity_flatten(all)" to combine all entities into one publication (see the sketch after this list).
* We strip all unnecessary/unused markup out of the tables, removing unnecessary/unused language tags, etc.
* I believe that working locally, on a desktop PC, as opposed to accessing files over a network, can also speed things up. I've created a local "epicuser" folder on "C:" for working with some of our really ugly files.
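
A sketch of how that flatten step could slot into the hidden-document sequence above (same hypothetical file and variable names; I'm assuming entity_flatten(all) chains after current_doc the same way the other commands do):

current_doc($d);entity_flatten(all)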

Other commands in this vein:

current_doc($d);format allpasses force wait;preview
current_doc($d);format allpasses force wait
current_doc($d);print panel
current_doc($d);set fosi=C:\epicuser\yourfosiname.fos;format allpasses force wait;print panel


Anyway, that's my 2 cents. I haven't had to run one of these monsters
for a while. I'd be interested to hear how things work out.

Greg
🙂

byork
17-Peridot
(To:ptc-1929204)

Thanks for all the help thus far.

Paul:
I checked my tables and don't have any table footers so I'm good there.

Suzanne:
By default, does Editor use formatting pass reduction? I tried a couple of things there, but I'm not sure I'm making the right switches, and nothing seemed to help.

Greg:
Thanks for the suggestions. I tried what you suggested, but didn't see any real improvement. I am using PE and have some XSL-FO.

Ed:

I'm starting to agree with you. 🙂

All –
Is there a better way to create parts lists than using tables? Would a custom table model publish faster?

Brian,

Re: formatting pass reduction, the FOSI has to be coded in a certain way for it to work. Check out help 188. Basically, you need to reserve enough horizontal space for page numbers to be overlaid on the pages created during the first formatting pass. Also, be sure that APTNOOVERLAYPAGENUMBERS is not set to yes, which disables formatting pass reduction.
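
For a quick sanity check from the Editor command line, something like this should show whether that variable is set in Editor's environment (a sketch assuming ACL's getenv(); an empty result means it is not set):

eval getenv("APTNOOVERLAYPAGENUMBERS")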

Re: not using tables, algroup and indent can be used to align elements into columns, with ruling and possibly boxing providing table and cell borders. Before gentables, the algroup approach was the only way to align non-table markup. The output is flowing text, which formats faster than table markup. However, the algroup approach may not support the desired formatting. Then again, some fancy FOSI footwork may do the trick.

Good luck!
Suzanne
www.FOSIexpert.com
"WYSIWYG is last-century technology!"


> Is there a better way to create parts lists than using tables? Would a
> custom table model publish faster?


Take a look at the help for index formatting. Somewhere in there is a parts
list tutorial using index code. Whether it will be faster or not, though, I
don't know.

--
Paul Nagai

All,
Been busier lately than [insert favorite metaphor here], so I hadn't had time to respond to this. If I remember correctly, someone mentioned earlier that there might be a time-out issue. My input is in regard to that issue.

We are using 5.2, composing PDF from Editor on a PC, using PE on a
server. We very regularly compose XML that generates
multiple-thousand-page PDF files. Twenty minutes wouldn't begin to touch
one of those.

Related to that, consider that most of our files load into Editor in a matter of minutes, but one of them (not the largest by any means) takes literally 4 (four) hours to load. Something in its inherent structure seems to make the parser 'thrash'.

Right at the moment, I don't remember what the server time-outs were reset to, but it is in terms of hours, and I believe it's in the neighborhood of 2 days. Yes, getting it to compose faster would always be good, but...

My bottom line on this is: sometimes, as with our 4-hour load time, you just can't find what it is that drives the Editor/PE lags. You just have to live with it. If the process is failing and your time-outs haven't been extended, I'd strongly suggest doing that first.

Or are you not using PE? As I said, been killer hectic around here all
week...

Sincere Regards,
Steve Thompson
+1(316)977-0515
byork
17-Peridot
(To:ptc-1929204)

Steve,



Thanks for the info. It just seems that a 56-page book shouldn't take 7 or 8 minutes to compose.



Can you tell me how to check and set the timeout value on PE? I haven't worked with PE configuration very much.



Thanks,

Brian
byork
17-Peridot
(To:ptc-1929204)

We were finally able to figure out what was going on. We had a bunch of nested page sets that had gone unnoticed and were causing unnecessary flattening in PE. Once we fixed that, our books ran much faster: it now takes about 4 minutes to run a 56- to 60-page book. Thanks for all your help.