Today's date: 12/30/2024
Last blog entry: 7/27/2004
Last article entry: 12/4/2003
 

  T's search engine optimization blog, or as some will say, the diary of a ........ artist, haw ha ha!

"T's search engine optimization Blog and SEO News"

"opinions are like asseholes, everyone has one, take from mine what you want and forget the rest of it!"
Da' Tmeister, Editor
 
 

search engine optimization and submission articles

A Dynamic Solution for Dynamic URLs

By Terry Van Horne AKA Webmaster T

As a web developer and programmer, I have been building and optimizing dynamic sites for the past four years. I knew going in that there were some problems with optimizing database-driven sites. The first step in programming a solution is identifying the problems you'll encounter along the way. A major problem for dynamic sites is search engine visibility.

Search engine visibility, or the degree and depth to which a spider crawls a site, is crucial to all organic SEO campaigns. On static websites this can be controlled through document structure/design, link hierarchy/structure and site architecture. Dynamic sites present challenges to link structure because some search engines' crawlers do not follow dynamic URLs (www.site.com/followme.asp?dynamic=maybe).

The querystring (?dynamic=maybe) is the cause of the indexing problem. The engines affected are primarily AltaVista (Scooter) and Inktomi (Slurp). The others have fewer problems, but can still be troublesome.
Other querystring problems:

  1. GoogleGuy, a representative of Google Inc., has stated that there is a problem with session IDs in querystrings. Avoid using "ID" as the parameter name and use alphanumeric values if at all possible, particularly if the platform is NT. Strictly numeric values can look like session IDs: NT servers use a numeric value for session IDs (usually 6+ digits), so combining the name "ID" with a numeric value is riskier than ?NUM=1234. PHP session IDs are alphanumeric, so on that platform a numeric value may be safer. At the very least, avoid using "ID" as a parameter name. Since sessions are often used to provide dynamic content, other search engines besides Google may also be wary of URLs containing querystrings. (See the sketch after this list.)
  2. Avoid using more than three parameters in the querystring.
  3. Querystrings are often used to maintain state, so for a shopping cart the best solution includes a way to maintain state without the querystring.
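
To make the first two points concrete, here is a small illustration of the querystring shapes being discussed (my own sketch, not from the original article), written in classic ASP since that is the platform used later on. The parameter names cat and item are hypothetical.

    <%
    ' Illustration only: the same record requested with querystrings of varying risk.
    '
    '   Riskier: page.asp?ID=482913                     numeric value named "ID" looks like an NT session ID
    '   Better:  page.asp?NUM=1234                      still numeric, but at least not named "ID"
    '   Safer:   page.asp?cat=widgets&item=blue-widget  two short, descriptive, alphanumeric parameters
    '
    ' Reading the "safer" form:
    Dim cat, item
    cat  = Request.QueryString("cat")     ' e.g. "widgets"
    item = Request.QueryString("item")    ' e.g. "blue-widget"
    %>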

Maintaining State Without Querystrings:
At first this seemed like a daunting task. The ultimate answer was staring me in the face for some time before it came to me. Cookies are part of the solution, but because they are dependent on user settings it is advisable to include an alternate tracking method. The method I developed does use parameters; however, search engines or users have to submit a form in order to invoke parameters in the URL. If cookies are enabled (search engine spiders aren't cookie-enabled), the session is tracked in NT sessions (phantom cookies), so no parameters are used to maintain state.
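
Here is a minimal sketch of that approach in classic ASP. It is my illustration of the description above, not T's actual code; the token field name and the cart.asp target are made up for the example.

    <%
    ' Prefer the NT Session object ("phantom cookies") when the browser accepts
    ' cookies; otherwise carry a token in a POSTed form field. Spiders neither
    ' accept cookies nor submit forms, so the URLs they crawl stay parameter-free.
    Dim token
    If Session("token") <> "" Then
        token = Session("token")              ' cookie-enabled visitor: state stays server-side
    ElseIf Request.Form("token") <> "" Then
        token = Request.Form("token")         ' cookie-less visitor: state arrived with the form post
    Else
        Randomize
        token = CStr(CLng(Rnd * 1000000)) & "-" & CStr(CLng(Timer))   ' crude new token
        Session("token") = token              ' persists only if cookies are accepted
    End If
    %>
    <!-- state rides in a hidden field, never in the querystring -->
    <form method="post" action="cart.asp">
        <input type="hidden" name="token" value="<%= token %>">
        <input type="submit" value="Add to cart">
    </form>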

Parameters in Querystring:
I do use parameters, but I also use them to steer engines, since static .asp pages serve the sub-categories. Because this is "redundant" rather than "duplicate" content, engines that don't like parameters won't follow them; and since there is a static page with the same information formatted differently, following them isn't crucial to visibility.

Removing the Querystring:
The best answer to the querystring challenge is to remove it, using either a server-based or a programmed solution to pass the parameters (?name=value) to the program and ultimately the database. The usability gain from removing querystrings should also be counted in your solution's value: a URL without a querystring is easier for a user to remember.

One workaround I read about uses a sitemap to link to static pages built from the dynamic pages. Though this is an easy thing to do for a small site, I immediately dismissed it as a workaround rather than a solution. There is no point in making portions of a dynamic site static; it seemed to me this just defeats the purpose of making it data-driven in the first place. In my opinion, it also has huge scalability issues. After all, total visibility is the goal, and a sitemap (or group of sitemaps) with 3,500 links isn't going to be very effective at inducing full indexing. Moreover, I'm not a big believer in the sitemap strategy even for static sites; it just seems like another workaround rather than a structured solution.

I didn't go for a server-based solution because I was concerned about the associated server load. Pathinfo (www.site.com/followme.asp/dynamic/tough) was very interesting, more so as a Perl solution; VBScript is pretty verbose and its pathinfo methods looked kludgy to me, but then again, most MS methods look kludgy to me ;). Another notable server solution is mod_rewrite on UNIX, with either a component or a comparable service providing the equivalent on NT.
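
For what it's worth, here is roughly what the pathinfo approach looks like in VBScript. This is a sketch of the general technique, not the method T settled on; followme.asp and the category/item parts are just placeholders.

    <%
    ' URL requested:  www.site.com/followme.asp/widgets/blue
    ' PATH_INFO:      /followme.asp/widgets/blue  -- no querystring for a spider to balk at
    Dim pathInfo, parts, category, item
    pathInfo = Request.ServerVariables("PATH_INFO")
    parts = Split(pathInfo, "/")          ' parts(0)="", parts(1)="followme.asp", the rest is data
    If UBound(parts) >= 3 Then
        category = parts(2)               ' "widgets"
        item     = parts(3)               ' "blue"
    End If
    ' category and item can now feed the database query just as ?cat=widgets&item=blue would
    %>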

The first solution I wrote was in Perl, using SSI to embed the database parameters. The pages had static names (page.shtml), which was the ultimate goal for this solution. This worked well, but it was eventually re-programmed as an .asp solution with the database calls embedded in the page. To spiders the site appears static. I advised a client to implement the embedded SQL queries on a ColdFusion site, with instant success: the site went from 100 pages in Google to over 5,000 in less than six weeks! The success is also partially due to the site having a PR5 entry page.
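
As a rough idea of what "database calls embedded in the page" means, here is a stripped-down sketch in classic ASP. The table and column names, the DSN and the widgets.asp file name are all invented for the illustration; T's post to I-search (linked below) has his actual example.

    <%
    ' widgets.asp -- a statically named page; the SQL is embedded in the page
    ' itself, so a spider sees an ordinary .asp URL with no querystring.
    Dim conn, rs
    Set conn = Server.CreateObject("ADODB.Connection")
    conn.Open "DSN=catalog"               ' hypothetical DSN
    Set rs = conn.Execute("SELECT title, summary FROM products WHERE category = 'widgets'")
    Do While Not rs.EOF
        Response.Write "<h2>" & rs("title") & "</h2>"
        Response.Write "<p>" & rs("summary") & "</p>"
        rs.MoveNext
    Loop
    rs.Close : conn.Close
    Set rs = Nothing : Set conn = Nothing
    %>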

Whenever I'm providing a solution, I weigh benefits against cost. Sometimes it is more cost-effective to look at a non-programmatic solution. It is often effective to look beyond just what you can "personally" provide.

PositionTech provides one-stop shopping for paid inclusion in all the major programs. This may be a more cost-effective workaround than implementing any other solution. The only real drawback to inclusion is that internal links aren't followed, so it only provides a partial solution to the indexing problem.

Implementing the inclusion program should include careful targeting of keyword phrases; optimize as many pages as you can afford to. XML feeds can also be used to leverage the database for greater visibility, putting the programming almost on steroids, with the big caveat that they can be expensive since payment is based on clicks. However, for anyone thinking of cloaking, this is far less risky and likely about the same cost.

See this post I made to I-search for an example of embedded SQL code and the optimization of a page using includes and database programming. Bear in mind that although I have used .asp in the example code, this embedded SQL solution can also easily be implemented in JSP (Java Server Pages), PHP and ColdFusion.

Edited by Robert Gladstein, Raisemyrank.com

............... see ya at the top!
Da' Tmeister, Editor

posts on search engine optimization and submission
SEO Hangouts:

SEO Training Dojo w/theGypsy

 For less than the cost of a cuppa' coffee a day?
SEOdojo SEO Training: As a certification and training committee member for SeoPros, I found theGypsy's SEO Dojo has the best SEO patent library available. Not to mention the incredible peeps to learn with and from!
T's Quote:

"What the mind of man can conceive and believe, It can achieve."

Napolean Hill ~ Think and Grow Rich


DoJoPeeps to Checkout!
Steve Gerenscer AKA Feydakin
Animal Charms
Webmaster T's New SEO recommendation service. Search engine marketing, campaign monitoring and certification. Rating real results from active campaigns and services. See your site like a search engine does!

search engine optimization articles
  Looking for something you've read in the past in the blog area, or something T has queued for publishing? Check Webmaster T's search engine optimization and SEO blog archives. If it was on the cover, you'll find it there.
  
  archives
  
All the old search engine placement and web development articles are archived here. Watch the searchable Directory evolve from the archive.

T's World logo, cover and awards graphics by and Copyright © 1997-2009 Markus Gemstad.
Copyright © 1997-2009 International Website Builders, all rights reserved.