I am using Creo 3 with an Nvidia Quadro K2000 card driving two 28" 4K screens at 3840x2160.
I can increase text sizes in Windows to make normal screen use feasible with normal human eyesight. At 200% scaling the text is legible, much like a normal non-4K screen but clearer.
When I open Creo 3 it looks lovely; the fonts are scaled up in accordance with Windows. But then you try using it. You realise the lines are so thin that it is hard to discern their colours, and pre-selection is hopeless because no difference can be detected.
The 'expand' triangles on the model tree and the sweet spot of the dimension numbers are so small that it takes four clicks for a pick to register. The hit radius is minute. These items have not scaled with the rest, so selection is a nightmare and very slow.
In addition, I am struggling to get the model (1 MB) to rotate fast enough with the K2000 card. I have a very fast PC and 32 GB RAM. I tried changing things like display quality and anti-aliasing, but that makes it so slow that the card exceeds the Windows display timeout limit of 2 seconds and crashes Windows, so Creo crashes.
Is anyone else using a 4K setup with 28" screens? Anyone got any ideas?
Mike
What happens at 1920x1080 resolution? And yeah, I'm not sure a K2000 is going to push two monitors at 3840x2160 well. Maybe someone else who is doing it can respond.
If I'm reading NVidia's spec correctly, it looks like it should be able to drive 4 monitors simultaneously at that resolution (via Display Port).
Whilst reading the Nvidia spec is interesting, it reveals little beyond the fact that, yes, it's possible to run 4 monitors via DisplayPort. When you actually load a Creo 3 model and try to view and manipulate it, the story is, for me, somewhat different. It runs so slowly that using a 3Dconnexion SpacePilot is pointless, as the model lags so much. Touching the display settings under Options was a mistake for me, as it is now poorer than before and I can't get back to the original settings. Any sort of lag is immediately noticeable with a SpacePilot.
I am not in any way certain this is a Creo or Nvidia problem; of course it may be a hardware issue. But I have the latest Nvidia driver, 32 GB RAM and a 4-core Xeon processor currently scoring 4800 on the CPU PassMark charts, in a Dell T5400 with Windows 7 Ultimate fully up to date.
I assume others are, or soon will be, using a similar hardware configuration, and it's important users know what limitations they might face.
It would be really interesting to see how the graphics card is performing while trying to roll the model around. I wouldn't be surprised if the issue was with the CPU (probably a single-threaded process) rather than some GPU limitation. Have you run any GPU visual benchmarks? How does it do with only one monitor enabled? How does that compare to having both monitors enabled? I could be completely wrong, but I highly doubt Creo is maxing out the GPU. The only way to know is to monitor both while rolling the model around.
I was thinking the same thing - however doubling the pixels in X and Y means 4X the pixels to push. Even a fast GPU will see a drop-off in responsiveness. If PTC is using OpenGL right, there is almost no CPU involvement in spinning.
The rest of the problems are systemic with increased pixel density. If the screen were a 48-inch screen, the lines and icons would be legible.
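To put numbers on that pixel-density point, here is a quick sketch of the arithmetic in Python (the 48" comparison is the one made above; nothing here is specific to Creo):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch for a screen of the given resolution and diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

# 3840x2160 on a 28" panel vs the same resolution on a 48" panel:
print(round(ppi(3840, 2160, 28)))  # ~157 PPI: 1-pixel lines become hairlines
print(round(ppi(3840, 2160, 48)))  # ~92 PPI: close to a classic 24" 1920x1200 screen
```

At roughly double the pixel density, a 1-pixel stroke covers about half the physical width it did on the older screen, which matches the "can't discern the colours" complaint.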
Eventually the second interface limit gets hit: either it takes a lot of mouse pulses to move the pointer across the screen, so you need to move the mouse a foot or two to go from one screen edge to the other, or the sensitivity goes up so that the mouse is too touchy. Adaptive sensitivity (mouse acceleration) can help, but it causes wandering-mouse syndrome, where the user has to lift and reposition the mouse to make up for asymmetric accumulations. Which is why I've moved to a thumb-ball mouse.
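That asymmetric-accumulation effect can be illustrated with a toy acceleration curve (the numbers and the function are entirely made up for illustration, not any real driver's transfer function):

```python
def accelerated_delta(raw_delta, base_gain=1.0, accel=0.125, threshold=4):
    """Toy pointer-acceleration curve: motion below the speed threshold is
    passed through 1:1; faster motion gets a speed-dependent gain boost."""
    speed = abs(raw_delta)
    gain = base_gain + (accel * (speed - threshold) if speed > threshold else 0.0)
    return raw_delta * gain

# Ten slow counts out, one fast flick back: the mouse is back where it
# started on the desk, but the pointer is not -> "wandering mouse".
out = sum(accelerated_delta(2) for _ in range(10))   # 20.0 (slow, 1:1)
back = accelerated_delta(-20)                        # -60.0 (fast, 3x gain)
print(out + back)                                    # -40.0 residual drift
```

The nonlinearity means the same physical hand motion covers different on-screen distances depending on speed, which is exactly why the user ends up lifting and repositioning the mouse.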
David Schenken wrote:
I was thinking the same thing - however doubling the pixels in X and Y means 4X the pixels to push. Even a fast GPU will see a drop-off in responsiveness. If PTC is using OpenGL right, there is almost no CPU involvement in spinning.
I think that's a big "if". Typically on my system (a fairly fast Xeon, E5-1620) spinning framerate is limited by one CPU core being flat-out, with very low GPU usage (<10%) unless I have transparency enabled (and some transparent parts). Note that when looking at Passmark score, it's the single-thread value that matters (around 1900 for my CPU; ~2500 for the very fastest i7).
However, with 4K resolution I can imagine that either GPU memory bandwidth or simple GPU processing power may become a limitation. If you run a monitoring program such as GPU-Z, what GPU usage does it show while spinning?
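For anyone who wants those numbers without installing GPU-Z: Nvidia ships a command-line tool, nvidia-smi, that reports utilisation. A rough Python sketch of polling it while spinning the model (the query flags are standard nvidia-smi options; the canned-sample path is just there so the parsing is visible without a GPU):

```python
import subprocess

def gpu_utilization(sample=None):
    """Return (gpu_util_pct, mem_used_mib) from nvidia-smi's CSV output.

    Pass `sample` to parse canned text; otherwise nvidia-smi is invoked,
    which requires an NVIDIA driver installed and the tool on the PATH.
    """
    if sample is None:
        sample = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,memory.used",
             "--format=csv,noheader,nounits"],
            text=True)
    util, mem = sample.strip().split(", ")
    return int(util), int(mem)

# Parsing a canned line such as one poll of the card might return:
print(gpu_utilization("8, 412\n"))  # (8, 412)
```

Calling this in a loop while rolling the model around, alongside a CPU monitor like Task Manager, would show whether the bottleneck is the one busy core or the card.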
If the CPU is busy during spinning, then either hardware acceleration isn't being used (win32_gdi) or PTC is wasting perfectly good graphics cards. It's possible the models you have aren't demanding enough for the (unnamed) GPU you are using, and the CPU activity is just busy-polling the mouse.
In Pro/ENGINEER and in Creo Parametric, the CPU is always busy during model spinning (this is my opinion). I have never read or heard a detailed explanation (presented by a PTC R&D expert) of how Pro/ENGINEER or Creo Parametric handles model spinning. Maybe this information is TOP SECRET.
Martin Hanak
David Schenken wrote:
If the CPU is busy during spinning, then either hardware acceleration isn't being used (win32_gdi) or PTC is wasting perfectly good graphics cards.
It's a Quadro 4000; I'm pretty sure that none of the configs I use have the win32_gdi option; and framerate drops with model size, so I'd say it's the latter.
In fairness, Catia is not dissimilar: spinning a pretty simple model just now pegs the CPU at 28% (so one core flat-out, plus a bit), albeit with the GPU around 65% according to HWMonitor. A quick test of a very simple model in Creo gives 30% CPU and 8% GPU (if I wave the mouse around frantically); interestingly, a significantly larger assembly in Creo hits about 20% CPU and 55% GPU, and actually still feels very responsive.
Yeah, that's why I said "well". And it's interesting that the spec says 4 monitors when the card only has 2 DisplayPort outputs. That is a LOT of pixels to push, and I am sure anything like email, YouTube videos, etc. would be fine.
This problem will probably require a lot of development effort to deal with. I believe the graphics subsystem is generally set to use 1-pixel-wide lines, so as the pixels get smaller, the lines they are made of become less visible. Maybe the solution can come from the card maker as a custom control for stroked lines.
Adobe dealt with a similar problem. In PDFs lines are given widths, but when zoomed out, those widths can be smaller than 1 pixel. On grayscale displays Adobe would use anti-aliasing to approximate the lines by changing the pixel colour to represent the proportion of line covered; if a black line would cover half the pixel, the pixel would be 50% gray. However, if the lines were thin enough they would essentially vanish, so Adobe added a setting to round up to the nearest pixel width, so even very thin lines would be 1 pixel wide, and thus generally more legible.
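As a sketch of that "round thin lines up" idea (the function and numbers are illustrative, not Adobe's actual code):

```python
def device_line_width(width_pt, zoom, dpi, enhance_thin_lines=True):
    """Convert a line width in points to device pixels.

    With enhance_thin_lines, a width that would fall below one device
    pixel is rounded up so the stroke never vanishes, mimicking the
    Acrobat preference described above."""
    px = width_pt / 72.0 * dpi * zoom   # 72 points per inch
    if enhance_thin_lines and px < 1.0:
        px = 1.0
    return px

# A 0.5 pt line at 25% zoom on a 96 DPI display:
print(device_line_width(0.5, zoom=0.25, dpi=96))                            # 1.0 px
print(device_line_width(0.5, zoom=0.25, dpi=96, enhance_thin_lines=False))  # ~0.17 px: effectively invisible
```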
The PTC problem is that their lines don't typically have a real width associated with them, and even if they did, they would eventually run into the same problem as Adobe when 1 pixel is too small.
The best answer is for Microsoft to use SVG for its interface elements and build in scaling to the screen size, so that icon sizes are invariant without having to build them as bitmaps at different sizes. PTC could make their own subsystem as well but, as I started with, it will take a lot of development.
David, you have explained the issue that I feared. If this is the case, then we have a situation where a technology advance in monitors and graphics cards is being supported and promoted by PTC when the combination may be unusable for some in practice. I sincerely hope this is not the case. If it is, then I seem a fool for being an early adopter. Yet I did my research as best I could, going through all the documentation from PTC and Nvidia and looking for any recommendations between the Nvidia and ATI cards. The only issue that still concerns me is that the latest Nvidia-approved graphics driver remains for Creo 2, not Creo 3.
Surely PTC have tested this hardware combination, or something similar, and can comment? If they have, then their advice on how to optimise the use of 4K screens would be valuable to users. Can they help? This seems like something the average reseller supporting users may not have come across yet.
Just noticed "the sweet spot of the dimension numbers". I don't know if it will help, but try the config option:
pick_aperture_radius
Specifies the size of the area around the mouse when making selections. Units are 1/1000 of screen size.
Default: 7
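If it helps to see what that default works out to in pixels, here is a rough conversion (assuming "screen size" means the screen dimension in pixels; the option description doesn't say which axis, so treat this as approximate):

```python
def aperture_px(pick_aperture_radius, screen_px):
    """Approximate selection hit radius in pixels, assuming the option's
    'units of 1/1000 of screen size' scales with the given dimension."""
    return pick_aperture_radius * screen_px / 1000.0

# Default of 7, on a 1920-wide screen vs a 3840-wide 4K screen:
print(aperture_px(7, 1920))  # 13.44 px
print(aperture_px(7, 3840))  # 26.88 px
```

Note that because the units scale with screen size, the aperture should already cover the same fraction of the screen at 4K; raising the value may still help if the pick targets themselves (triangles, dimension text) haven't scaled.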