- Switch off debugging output. All errors and warnings should go to logs, not to your site's output. You might leave debugging on for your office IPs, if you have a way to see what other visitors see (additional IPs or proxies). You do not want outsiders to know what happens inside your site, especially if your debug output exposes database connection details (a security breach).
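The post never names a platform, so purely as an illustration, on a typical PHP setup this is controlled by standard php.ini directives (they can also be overridden per virtual host or in .htaccess):

```ini
; Keep errors out of rendered pages and route them to a log file instead
display_errors = Off
log_errors = On
error_log = /var/log/php_errors.log
```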
- Once again, use your site as a plain, first-time user would. Look for usability and functionality problems, preferably in the most popular browsers.
- Stress-test the major page types on your site. A good free tool for this is ab (Apache Benchmark), included with the Apache distribution, though there are certainly more sophisticated ones. Make sure the site stays usable under load.
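For example, a typical ab run against a single page (URL and numbers are placeholders) sends 500 requests, 10 at a time, and reports requests per second and response-time percentiles:

```
ab -n 500 -c 10 http://www.example.com/
```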
- Make a list of things you could optimize in the future if it becomes necessary.
- Check all links for duplicates and 404 errors. A good way to find 404s is to run a site grabber from your PC and then check the server error logs for missing pages. Adobe Acrobat is quite a good grabber for smallish sites (several thousand pages), as it compiles everything into one document you can search for duplicates.
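If you prefer a scriptable alternative to a GUI grabber, a small crawler can do the same job. The sketch below is only an illustration: it assumes the third-party requests and beautifulsoup4 packages and a hypothetical start URL, walks internal links breadth-first, and prints any that return 404:

```python
from urllib.parse import urljoin, urlparse
from collections import deque

import requests
from bs4 import BeautifulSoup

START_URL = "http://www.example.com/"   # hypothetical start page
HOST = urlparse(START_URL).netloc

seen, queue = {START_URL}, deque([START_URL])

while queue:
    url = queue.popleft()
    resp = requests.get(url, timeout=10)
    if resp.status_code == 404:
        print("404:", url)
        continue
    if "text/html" not in resp.headers.get("Content-Type", ""):
        continue
    # Collect internal links only and queue the ones we have not visited yet
    for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == HOST and link not in seen:
            seen.add(link)
            queue.append(link)
```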
- Check for major security holes:
  - Try entering quoted text into every form; it should be escaped (or, better, bound through parameterized queries) before it reaches the database. If not, your site is vulnerable to SQL injection (see the sketch after this list).
  - Check for cross-site scripting and file-inclusion vulnerabilities, e.g. scripts injected through input that gets echoed back, or files included via URL parameters.
  - Change the password for the administrative user.
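On the SQL injection point, the safest fix is to never interpolate user input into SQL strings at all. A minimal sketch using Python's built-in sqlite3 module (the table and data are made up for illustration; the same placeholder idea applies to any database driver):

```python
import sqlite3

conn = sqlite3.connect(":memory:")                 # in-memory DB just for the demo
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('O''Brien')")

user_input = "O'Brien' OR '1'='1"                  # hostile text typed into a form

# Vulnerable: string concatenation lets the quote rewrite the query
# conn.execute("SELECT * FROM users WHERE name = '" + user_input + "'")

# Safe: the ? placeholder binds the value as data, quotes and all
rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
print(rows)   # [] -- the hostile string matches nothing instead of everything
```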
- Check that your sitemap and RSS feed regenerate successfully after you add content. On-the-fly RSS and (especially) sitemap generation can be slow.
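One common way around slow on-the-fly generation is to pre-build the sitemap into a static file whenever content changes (for example from a cron job). A rough sketch, assuming a hypothetical get_all_urls() that would return the site's public URLs:

```python
from datetime import date
from xml.sax.saxutils import escape

def get_all_urls():
    # Hypothetical: in a real site this would query the CMS or database
    return ["http://www.example.com/", "http://www.example.com/about"]

entries = "".join(
    "  <url><loc>%s</loc><lastmod>%s</lastmod></url>\n"
    % (escape(url), date.today().isoformat())
    for url in get_all_urls()
)

with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + entries + "</urlset>\n")
```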
- Check the SEO basics of your site – titles, headings, keywords, the robots meta tag, robots.txt, etc.
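As a quick reference, these are the kinds of tags to eyeball in each page template (the values here are placeholders):

```html
<head>
  <title>Descriptive, unique page title</title>
  <meta name="description" content="One or two sentences summarising the page">
  <meta name="keywords" content="a, few, relevant, terms">
  <!-- Only add a robots meta tag where you actually want to restrict indexing -->
  <meta name="robots" content="noindex, nofollow">
</head>
```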
- Check that the site is accessible from “outside” – look for leftover .htaccess allow/deny lists or IP blocks in code.
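A typical leftover that causes this is an Apache 2.2-style block in .htaccess that locks the site down to a single office IP (the address below is a placeholder); it should be removed or reversed before launch:

```
Order deny,allow
Deny from all
Allow from 192.0.2.10
```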
- Submit your site's sitemap to the search engines. Reference the sitemap in your robots.txt file as well.
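The robots.txt side of this is a single Sitemap line pointing at the full sitemap URL (the domain here is a placeholder):

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```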
- Add links to several pages of your site from other sites. Good places to start are related blogs and directories. If the new site has a blog, write a post and make sure it pings the blog search engines as well. This speeds up indexing of the site.
- Check the access logs, error logs and server load after a while. Look for whether search engine spiders have started to crawl the site.
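A quick way to spot crawler activity is to count bot user-agents in the access log. A small sketch, assuming a combined-format Apache log at a hypothetical path:

```python
from collections import Counter

BOTS = ("googlebot", "bingbot", "yandex", "baiduspider", "slurp")
hits = Counter()

with open("/var/log/apache2/access.log") as log:   # hypothetical path
    for line in log:
        ua = line.lower()
        for bot in BOTS:
            if bot in ua:
                hits[bot] += 1

for bot, count in hits.most_common():
    print(f"{bot}: {count} requests")
```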