Our Writing Projects

Our writing projects are managed using a generally accepted project life cycle based on PMI-PMBOK™ (the Project Management Institute's Project Management Body of Knowledge), supervised by our Pm4hire Affiliate.

Testing

Depending on where you host your website, you cannot always be sure that the site information is accessible whenever you want it to be.  To monitor site access you would need to visit the site several times a day and confirm that it provides the service you intend it to offer.  If the hosting service is not reliable, you need to know so that you can consider moving your business elsewhere.

Following up on site performance this way is a tedious exercise.  We created a robot process to automate the site visits and to confirm that the site is responding within the parameters you have set.  To monitor your website performance, check here:

Broken Links

This robot process is typically set up to run once or twice a month, using extracts of your web pages to find references to external sites.  The robot accesses each of those references and confirms that the link is still working, so that your site visitors can indeed reach those external sites through the links you have provided.  A simplified sketch of this link-checking pass follows the list below.

·         Like your website, external websites need periodic maintenance, and while we make every effort to maintain the consistency of our websites over time, other providers may not be so careful.  As a result, a link to information on another site may be here today and gone tomorrow.

·         The broken-links robot automatically extracts the external references and tracks where they occur in your website.  That means only one access to the external site is required for each unique link, no matter how many times that link appears in different parts of your website.  All we need is an updated version of your site.

·         When a broken link is found, we can create an e-mail outlining what is broken so that you can decide how it should be fixed.  Sometimes we can locate the same information on another page of the same site, but that cannot be guaranteed.
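As an illustration of how such a pass might work, here is a minimal sketch in Python.  It is not our production robot: the extract directory, the timeout, and the use of the `requests` package are assumptions made for the example, and link extraction is reduced to a simple pattern match.

    # Minimal sketch of a broken-link pass over local extracts of a site's pages.
    # Assumptions for the example (not part of the actual process): the pages are
    # saved as .html files in ./site_extract and the `requests` package is installed.
    import glob
    import re
    from collections import defaultdict

    import requests

    HREF_RE = re.compile(r'href="(https?://[^"]+)"', re.IGNORECASE)

    def collect_external_links(extract_dir):
        """Map each unique external URL to the pages on which it occurs."""
        occurrences = defaultdict(list)
        for path in glob.glob(extract_dir + "/*.html"):
            with open(path, encoding="utf-8", errors="replace") as fh:
                for url in HREF_RE.findall(fh.read()):
                    occurrences[url].append(path)
        return occurrences

    def check_links(occurrences, timeout=10.0):
        """Visit each unique URL once and collect those that do not respond."""
        broken = []
        for url, pages in occurrences.items():
            try:
                response = requests.head(url, allow_redirects=True, timeout=timeout)
                ok = response.status_code < 400
            except requests.RequestException:
                ok = False
            if not ok:
                broken.append(url + "  (referenced on: " + ", ".join(sorted(set(pages))) + ")")
        return broken

    if __name__ == "__main__":
        report = check_links(collect_external_links("site_extract"))
        print("\n".join(report) if report else "No broken links found.")

In the real process the findings are turned into the e-mail described above; the sketch simply prints them.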

Performance

This robot process is set up to run several times a day, trying to reach each of your web pages in succession.  It allows a fixed time for each page to load and become visible in the browser; if the page has not loaded within that time, the robot cancels the navigation and reports the failure to present the page within the allowable limit.  A simplified sketch of this check follows the list below.

·         Most websites have an occasional slow-down that creates the appearance of poor performance.  To keep such one-off events from raising false alarms, the robot maintains a history of when each session was executed, which pages failed to load in time, and whether a failure was occasional or a repeat failure.

·         We can help you analyze this performance history to rule out generic problems, including issues caused by our own queries reaching your site through our service provider.  For persistent problems without an explanation we can recommend considering alternate hosting services; for problems inherent in your website we can recommend upgrades to the site.
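To make the timed check and its history concrete, here is a minimal sketch in Python.  It is an approximation, not the actual robot: a plain HTTP request stands in for browser navigation, and the page list, time limit, and CSV history file are placeholders chosen for the example.

    # Minimal sketch of a timed availability check with a simple failure history.
    # Assumptions for the example: a plain HTTP GET stands in for browser navigation,
    # PAGES and TIME_LIMIT are placeholders, and the history is a local CSV file.
    import csv
    import time
    from datetime import datetime, timezone

    import requests

    PAGES = ["https://example.com/", "https://example.com/contact"]   # placeholder URLs
    TIME_LIMIT = 5.0                    # seconds allowed for a page to respond
    HISTORY_FILE = "uptime_history.csv"

    def check_pages():
        """Try each page in succession and append the outcome to the history."""
        stamp = datetime.now(timezone.utc).isoformat()
        with open(HISTORY_FILE, "a", newline="") as fh:
            writer = csv.writer(fh)
            for url in PAGES:
                start = time.monotonic()
                try:
                    response = requests.get(url, timeout=TIME_LIMIT)
                    ok = response.status_code < 400 and time.monotonic() - start <= TIME_LIMIT
                except requests.RequestException:
                    ok = False
                writer.writerow([stamp, url, "ok" if ok else "failed"])

    def repeat_failures(last_n=3):
        """Return pages whose most recent last_n checks all failed."""
        history = {}
        with open(HISTORY_FILE, newline="") as fh:
            for _stamp, url, status in csv.reader(fh):
                history.setdefault(url, []).append(status)
        return [url for url, statuses in history.items()
                if len(statuses) >= last_n and all(s == "failed" for s in statuses[-last_n:])]

    if __name__ == "__main__":
        check_pages()
        for url in repeat_failures():
            print("Repeat failure: " + url)

Run several times a day from a scheduler, a check like this separates an occasional slowdown from a repeat failure that is worth investigating.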