Posts Tagged ‘Network performance’

Since latency has so much to do with performance, I wanted to blog about the best practices users can follow to minimize its effect on the user experience.  Building these simple steps into your daily routine can substantially decrease the total number of transactions with the server and therefore increase perceived performance; a rough back-of-the-envelope sketch of the potential savings follows the list.

  • Turn off the Preview Pane in the ProjectWise Explorer Client in high-latency offices. This limits the amount of data and the number of transactions the Client needs during folder and file navigation.
  • Limit the use of custom Views, displaying only the minimum amount of data about each file.
  • Limit the use of titleblock integration with AutoCAD and MicroStation.
  • Limit the number of attributes in any Environment used in the high-latency offices.
  • Utilize Caching Server(s) as file storage in the high-latency offices where possible.
  • Utilize Caching Server(s) for file caching in the high-latency offices where possible.
  • Utilize Fetchfile.exe to pre-populate the local file cache.
  • Refrain from exiting MicroStation or AutoCAD when opening additional files.
  • Limit the number of files in any one folder to a manageable amount.
  • Navigate using the folder tree in ProjectWise rather than by selecting subfolders in the contents list.
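To see why trimming server transactions matters, here is a minimal back-of-the-envelope sketch in Python. The transaction counts and the 150 ms round-trip time are illustrative assumptions, not measured ProjectWise figures; the point is simply that sequential round trips multiply directly by latency.

```python
# Rough estimate of how transaction count and latency combine into wait time.
# All numbers below are illustrative assumptions, not measured ProjectWise figures.

RTT_MS = 150  # assumed round-trip latency for a remote office, in milliseconds

def wait_seconds(transactions: int, rtt_ms: float = RTT_MS) -> float:
    """Minimum time spent waiting on the network for sequential round trips."""
    return transactions * rtt_ms / 1000.0

# Hypothetical folder navigation: with the Preview Pane and a heavy custom View,
# versus with both trimmed back.
heavy = 400   # assumed server round trips with preview pane and extra columns
lean = 150    # assumed round trips after turning those features off

print(f"Heavy navigation: {wait_seconds(heavy):.1f} s of pure latency")
print(f"Lean navigation:  {wait_seconds(lean):.1f} s of pure latency")
```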

EJA


This is the first of what will be many posts about ProjectWise and performance.  I’ll cover subjects that will help you understand what makes ProjectWise perform like a finely tuned sports car or like a bus during rush hour.  The first topic to cover is latency and its effect on performance.

Much of the performance you can expect is determined by the latency between clients and servers.  In today’s digital age, bandwidth and latency determine the speed at which you receive your data. The nice thing is that money will usually buy you more bandwidth, but when it comes to latency, that is not always the case. Latency is the time it takes a packet of data to make the round trip: the data goes from the user interface into the kernel, out the network card, through switches, firewalls, and routers, into the network card and kernel on the other end, and then back along the same route. The time this whole process takes is the latency, and the operation may be repeated thousands of times per minute.  Therefore, high latency produces poor performance regardless of bandwidth, which only determines how many packets can be sent simultaneously.
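If you want a feel for the latency between an office and a server, one rough proxy is the time a TCP connection takes to complete, since the handshake itself is a round trip. The sketch below is a minimal example, assuming Python is available on the client; the hostname is a placeholder and 5800 is assumed to be the ProjectWise server port, so adjust both for your environment. A dedicated tool such as ping will give a more precise number.

```python
import socket
import time

def estimate_rtt_ms(host: str, port: int, samples: int = 5) -> float:
    """Estimate round-trip latency by timing TCP connection setup."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        # A successful connect requires a SYN/SYN-ACK round trip,
        # so its duration approximates the network latency.
        with socket.create_connection((host, port), timeout=5):
            pass
        timings.append((time.perf_counter() - start) * 1000.0)
    return sum(timings) / len(timings)

if __name__ == "__main__":
    # Placeholder hostname; 5800 is assumed as the ProjectWise server port.
    print(f"Average RTT: {estimate_rtt_ms('pwserver.example.com', 5800):.1f} ms")
```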

The chart below can help set expectations for performance.

[Chart: expected performance at various latency levels]

In future posts I’ll discuss ways you can minimize the effects that latency has on the user experience.

EJA