Tuesday, June 15, 2010

How to install SharePoint 2007 on Windows Server 2008

Hi folks! I am putting down the simple steps for how you can install SharePoint 2007 on Windows Server 2008.
There is one trick to this install. It is called a slipstream install, which means the SharePoint 2007 installation carries the required WSS 3.0 and SharePoint 2007 SP1 updates in one go, i.e. during the install itself the installation package picks up the service packs of both in order to complete the install successfully.

Here is how you go about it:

1. Copy the installation directory from the SharePoint 2007 CD/DVD to the hard disk of your computer.
2. You will find separate folders for each hardware type, 64-bit or 32-bit (x86). Copy the respective
    directory to your HDD. Example: I have a 32-bit machine, so I copied the entire x86 folder to the HDD of
    my PC.
3. Download the service packs for both WSS 3.0 and SharePoint 2007 to your HDD.
4. After the download completes, type the following commands at the command prompt:
      name_of_WSS3.0_ServicePack /extract:C:\install
      and
      name_of_SharePoint2007_ServicePack /extract:C:\install
      
    where C:\install is a new folder you create to hold the files extracted from the respective service packs.

5. Now, copy the contents of this "install" folder into the 'Updates' folder under the SharePoint 2007 server installation files. In my case, I went to the x86 folder and browsed to the Updates folder to copy the entire contents there.
6. After that, when I clicked setup.exe, the installation continued and asked for the product key!

After this step the installation is simple and proceeds as usual.
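To recap steps 4 and 5, here is a minimal sketch of the commands run from an elevated prompt. The service pack file names and the C:\SharePoint2007\x86 path are placeholders; substitute the files you downloaded in step 3 and the folder you copied in step 2.

# File names below are placeholders; use the service pack files you downloaded.
.\WSSv3SP1-x86.exe /extract:C:\install
.\OfficeServer2007SP1-x86.exe /extract:C:\install

# Copy the extracted patches into the Updates folder of the copied installation media.
Copy-Item -Path C:\install\* -Destination C:\SharePoint2007\x86\Updates -Force

# Launch setup from the slipstreamed media.
& C:\SharePoint2007\x86\setup.exe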

Hope you enjoy SharePoint 2007 on Windows Server 2008.

Wednesday, May 19, 2010

SharePoint 2007 to 2010 Upgrade strategies

Good news for all you SharePoint enthusiasts and professionals, which many of you may already be aware of: Microsoft has released SharePoint 2010! Great, isn't it?! But how do you upgrade to SharePoint 2010?

Not to worry... there are two approaches, and the choice depends on the hardware capabilities of the server on which your 2007 version is running. I will point straight at the processor... yes, a '64-bit' processor!
If your SharePoint 2007 box running Central Administration already has one, then you should go for the "in-place" upgrade.

The in-place upgrade is simple, as you don't need to do anything extra other than running your 2010 CD/DVD to install and upgrade. If compatible hardware is found, i.e. a '64-bit' processor (along with the other RAM and HDD requirements), the setup automatically switches itself into in-place upgrade mode.
The advantage is that your farm configurations and settings are preserved and you don't have to do anything extra. A smooth upgrade. :-)


The other approach is the 'database attach' upgrade. This upgrade process is for machines that don't already have 64-bit processors. For these machines you will have to move the concerned databases to another machine that has a 64-bit processor and SharePoint 2010 already installed from its CD. This is a mandatory step for this upgrade approach, and you should also understand that SharePoint 2010 supports only 64-bit machines.
Once the SharePoint Configuration Wizard run is completed on the new machine, you attach the databases from your existing 2007 farm to this new SharePoint 2010 machine.
You will have to set up the configurations manually, as attaching the configuration database does not work. One thing you should note is that if you miss any of the configurations or any out-of-the-box customizations done in your existing server farm, you will lose them on the new server, so you need to take care of that yourself manually!
The database attach upgrade process does not capture all of this automatically the way the in-place upgrade does; you will have to do it all manually once again.
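For the database attach path, a minimal sketch of checking and attaching one content database from the SharePoint 2010 Management Shell on the new farm could look like this; the web application URL, database name and SQL server name are placeholders for your environment.

# Run from the SharePoint 2010 Management Shell on the new 64-bit farm.
# The URL, database name and server name below are placeholders.

# Report missing features/customizations before attaching the 2007 content database.
Test-SPContentDatabase -Name "WSS_Content_Portal" -WebApplication "http://portal"

# Attach (and thereby upgrade) the content database to the new web application.
Mount-SPContentDatabase -Name "WSS_Content_Portal" -WebApplication "http://portal" -DatabaseServer "SQL2010"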

This post is focused only on letting you understand the two basic approaches for a SharePoint 2007 to 2010 upgrade and on knowing when to use each of them. A detailed step-by-step upgrade guide is already provided on the Microsoft sites, based on the version of SQL Server you use.
Please find the link on how to upgrade here --> http://technet.microsoft.com/en-us/library/cc263299.aspx

If you have a large server farm with mixed hardware configurations, then you can combine the two upgrade strategies above and apply each of them to the respective machines as required. Microsoft calls these hybrid strategies.





Wishing you all smooth and successful upgrades!




Sunday, May 16, 2010

Windows PowerShell 2.0 for SharePoint 2010

With the release of Microsoft's latest range of 2010 products, most of them from the Office suite, like Office, Visio, Project and SharePoint, also comes a powerful utility: PowerShell.

PowerShell packs the power of a 'good', if not great, shell environment in which an administrator can perform a variety of tasks with more power and control.
PowerShell combines the capabilities of cmd.exe with its own new features to make it a great tool for administrators. Also, the SharePoint administrator's favorite utility, stsadm.exe, is effectively replaced by it in the 2010 version.

PowerShell 2.0 comes with WMI support and includes the power of the .NET Framework, which gives an admin the ability to use those 'classes' to do a specific job. These classes are accessed by the admin through 'cmdlets'. It also includes 'if' conditional statements and looping with 'for' and 'while'. It can be used, for example, to access any item in the Windows registry and perform operations on it as required. All of this gives me a sense of the familiarity a Linux or Unix admin enjoys while working in their favorite shell environment.

Some examples:

1. PS> Get-SPFarm | Select *
2. PS> Get-SPWebApplication
3. PS> Remove-SPSite http://SampleSite
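And because PowerShell gives you conditionals and loops, the cmdlets can be combined into small scripts. Here is a minimal sketch (the web application URL and the 1 GB threshold are just placeholders) that walks the site collections of a web application and flags the large ones:

# The web application URL and the threshold are placeholders; adjust to your farm.
$threshold = 1GB
foreach ($site in Get-SPSite -WebApplication "http://portal" -Limit All) {
    if ($site.Usage.Storage -gt $threshold) {
        Write-Host "$($site.Url) uses $([math]::Round($site.Usage.Storage / 1MB)) MB"
    }
    $site.Dispose()   # dispose of SPSite objects once you are done with them
}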


With this tool Microsoft not only increases the efficiency of administrators but also adds power to the people who are in this profession!

Sunday, April 11, 2010

SEO comes built into IIS 7.0

Building a successful site is what any entrepreneur, site owner or webmaster would like. Search Engine Optimization has grown into a discipline with a direct relation to the financial success of a website. This is now being recognized by all the major players on the web: Google, Yahoo, Microsoft and IBM, to name a few.

To make websites perform better in the search engines, doing SEO is one of the best practices nowadays. Microsoft, which keeps innovating and coming up with improved offerings, has now launched IIS 7.0, which comes with a built-in SEO tool, the Search Engine Optimization Toolkit, that can help an administrator or webmaster fine-tune a site for SEO.

For more details, check out the embedded video at the link below:


Saturday, February 27, 2010

Tracking what the community says about you!!!

Organizations that are popular and have a good brand presence for the products and/or services they offer also have a good presence online. Media companies and individuals alike will definitely talk about the company and its offerings to the respective consumers. A positive image helps build the brand, and a negative one does exactly the opposite.

It is therefore very important for organizations to build a positive image and protect it from being pulled down by competitors. Something similar happened with an electric car company called REVA: the owners and senior management were worried to learn that a competitor had taken a crash-test video and blown it out of proportion to warn people not to drive the car because it was supposedly unsafe. Within no time after the YouTube video was released and comments were published, people across forums, blogs and community spaces started hearing about it, and it directly hit the car manufacturer's sales figures.

'Better late than never': the REVA guys realized that it was best to address their customers and the public directly about the car. They published press releases, went back to the community sites, forums, blogs and chats, and explained that the company was taking the issue seriously and working on making things better. Soon the panic among the general public came to an end and the car company's sales figures stabilized.

Lucky enough... because many organizations are not able to survive such attacks by the community, and they lose their image and brand name almost forever!

Had the REVA guys tracked this negative vibe earlier, they wouldn't have been in such a situation and would have avoided the losses from the dropping sales figures and demand for the car in the market.
Imagine yourself, your product, your brand name or the business in this scenario!!

Do we have a way through which we can track this, take the necessary precautions and correct the image before it is ruined by someone else?

Yes, of course you can! You do it using a tool that is like a search engine plus analytics system giving precise statistics about the vibes created around your brand on the web. It searches all the community spaces on the web for the most relevant search term you provide, one that could be associated with your product or your organization's name. It lists all the places where you are mentioned, along with an analysis of how people talk about you, how they rate related articles and express views and comments, what other bloggers think about your product, what users' opinions of your product are on community sites, and so on.

This tool is SocialMention.com

Use this to proactively stay a step ahead of the miscreants and save your brand image from being tarnished by anyone out there...

Saturday, December 19, 2009

IIS Crash and Hang in SharePoint Websites

This post truly relates to people who are suffering from IIS crashes and hangs. You know IIS processes have hung or crashed when you see a lot of warning messages in the system log. The messages read something like: "The process serving application pool 'sharepoint_site' failed to respond. The process id is 'xxxx'." If a series of such errors appears in your system log, you should use a standard debugging tool to find out what is happening inside.

Basically, an application pool is a virtual space where all the requests received by the server are pooled. The worker process, based on its scheduling, serves all the requests made from the client-side computers; until then, the requests wait in the application pool. In order to serve a particular request faster (to enable a faster response time), the server uses memory from a buffer or cache, but this buffer or cache has a certain memory limit. The memory allocated for it comes from the heap, and when too many requests arrive and the application is not designed properly, the heap starts overflowing or even fragmenting. This fragmentation generally happens when short-lived and long-lived allocations are mixed and are not cleared from the application's memory.

Now for an old-school theory that has very practical application: always free your objects when their use is over. Many developers think that because SharePoint code is managed code there is no need to dispose of anything. This is a mistake, because even though the code is managed, specific objects like SPWeb and SPSite hold references to unmanaged resources that need to be released through the IDisposable interface.
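In C# you would wrap these objects in using blocks; the same idea in a quick PowerShell 2.0 sketch against the 2007 object model looks like this (the site URL is a placeholder):

# The site URL is a placeholder; run this on a WSS 3.0 / MOSS 2007 server.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint") | Out-Null

$site = New-Object Microsoft.SharePoint.SPSite("http://sharepoint_site")
try {
    $web = $site.OpenWeb()
    Write-Host "Title: $($web.Title)"
}
finally {
    # SPWeb and SPSite hold unmanaged resources, so always dispose of them.
    if ($web) { $web.Dispose() }
    $site.Dispose()
}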

Once a developer makes sure that all objects are disposed of properly, he should deploy the code to the server. To verify this, he can use a standard tool from Microsoft: SPDisposeCheck.

After deploying your modules, confirm the behavior from the 12 hive (ULS) logs.

Thursday, November 12, 2009

Allocation of Database Size for SharePoint Applications

I wanted to share a new finding related to allocating the correct available space for SharePoint databases.

I was not seeing good page load times when the available free space in the database was less than 30% of its current size.

It was only after I set 30% autogrowth for all the databases on the SharePoint server that I saw the pages loading normally again. To make it simpler, here is an example: if your current database size is, say, 10,000 MB, then the allocated size should be 10,000 MB + 30% of 10,000 MB, which turns out to be 13,000 MB. After this, if you set the autogrowth property of the database to 30%, you will find that the pages load faster than before.
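If you want to see where a content database stands before resizing it, a quick sketch like the one below reports size and free space per file; the database and server names are placeholders, and it assumes sqlcmd and a trusted connection.

# Database and server names are placeholders; requires sqlcmd and a trusted connection.
$query = "SELECT name, size/128.0 AS SizeMB, size/128.0 - FILEPROPERTY(name, 'SpaceUsed')/128.0 AS FreeMB FROM sys.database_files"
sqlcmd -S "SQLSERVER01" -E -d "WSS_Content" -Q $query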

It was through trial and error that I learned this, and it took me a lot of time, which is why I want to share it: it might save you a good amount of time or simply help you avoid this scenario. The performance impact was immediate, and pages were rendering faster than before.

Monday, October 26, 2009

Building a strong online presence before you start product Promotion & Advertisement

Hi All!!
Before you go out and start marketing your products and services over the World Wide Web, you should make sure you already have a good presence there, because when you first start posting or writing about your products or services, a large number of people will start searching for your (company's) name on the web. To let them know and understand your brand, your organization, its values, vision and the products/services offered, you need to create an impression of your organization that helps build your brand.

Do you know where they look for you on the web to find out whether you are somebody?!

Most probably prominent search engines, web directories, social media spaces, forums, blogs, Wikipedia, Craigslist... and the list goes on. Obviously they may land on any of these before taking one more step towards you.

But do not forget that, however long this list may be, you need to be present in all these popular media before you begin the real promotion and marketing of your brand.
Once your presence is recognized across all these channels of communication, through which people usually connect and on which they rely for trustworthy information, you start leaving your competition behind. With this presence you are ready for a solid head start: when users learn about you through an advertisement, say an email campaign, they realize that this product or brand is well known and exists on most of the respectable digital channels they know. You will find that they trust your brand more than any other.

This little preparation, which I call web-readiness, will add great value to your web marketing and online promotion efforts. Do remember to keep track of all the channels through which you communicate with your users, and work out metrics that help you understand your customers, because that directly impacts your marketing efforts. Believe me, when you analyze all this valuable data you will get really valuable feedback and insights into building, remodelling and refining your business and marketing strategy, which will propel you to heights well above where you would otherwise have been!

Now your focus should be on building a relationship with your customers; once you are able to do that, you will see loyalty and trust build up for your brand among them. Once that happens, they will stick with you and wait for your products to hit the market.

This is a simple and solid way of developing a customer following for your brand. You only need to be very careful in choosing the content and the channel of communication.

Saturday, October 03, 2009

Performance Improvement of SharePoint Sites

Anyone involved in managing SharePoint servers or sites is well aware that performance management is a major concern for site administrators and managers. This is basically because all the data, including the pages, in a SharePoint server is served from the database itself, so each request puts additional load on the system as it is fulfilled by the database.
There are various suggestions from the Microsoft SharePoint team. I have customized and used a few that worked for me, based on the demands of my current implementation and the scope of SharePoint 2007.
Rather than going into much detail on each workaround, I would like to present them in a simple and straightforward manner so that they are easy for all audiences to understand.
  1. Reducing the size of the pages - This helps reduce the load time of the pages, especially when they are accessed over the Internet. This involves working with the master pages.
  2. Improving the farm architecture - Increasing the number of servers (web front-end, application and database servers) for proper request/response handling, and load-balancing the web front-end servers when there is a large number of concurrent users on the site.
  3. HTTP GZip compression - Enabling static (and, where applicable, dynamic) compression of the SharePoint site pages using IIS HTTP GZip compression. This works like zipping your usual Office files: the files are compressed before leaving the server and decompressed at the user's end, so page response times improve. You do need to analyze the impact on CPU utilization on the servers concerned (see the sketch after this list).
  4. Site output caching strategy - Decide on and enable site-level output caching, which keeps repeated page content from being fetched from the server every time.
  5. Monitoring CPU, server logs and user analytics - Properly managing the server by checking the SharePoint logs and event logs to fix issues and avoid performance hiccups for the users will ensure the server keeps functioning properly.
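On point 3, if your web front-ends run IIS 7 (Windows Server 2008), a minimal sketch for turning compression on from an elevated prompt is below; on IIS 6 the equivalent setting lives in the metabase, so treat this purely as an illustration.

# Enable static and dynamic compression on IIS 7 (run from an elevated prompt).
& "$env:windir\system32\inetsrv\appcmd.exe" set config -section:urlCompression /doStaticCompression:True /doDynamicCompression:True
# Dynamic compression costs CPU, so watch processor utilization on the web front-ends afterwards.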

After employing all these measures, we saw a considerable improvement in the performance of the portal: render time, response time and average latency all dropped noticeably.

When you combine the above with a proper backup strategy, backing up your site configurations and valuable data in a timely fashion, you will be in a much safer position.

Saturday, May 23, 2009

Microsoft warns of SharePoint 2007 SP2 bug

All users of Microsoft SharePoint Server 2007 should know that Microsoft has announced a bug. It affects users running SharePoint Server 2007 with the Service Pack 2 update.

Microsoft has confirmed that the bug does not harm any of your application's data; it only affects the expiration state of the software. 180 days after installing SP2, the software expires and becomes unusable to the end user.

There is a manual fix available for this problem, and the company is also working to close the issue by developing a hotfix. The next software update is supposed to resolve it automatically.

Find more details here --> http://support.microsoft.com/kb/971620

Saturday, April 11, 2009

My Idea of Building a Language Translator

I am a bit sad today... why...?
I saw Gmail's language transliteration feature. You choose your language, based (I guess) on the country you selected when creating your Gmail account. After you choose your language, all you need to do is start typing: type the words of your language in English, spelling them the way you would pronounce them, and when you press the space bar you will see each word converted into the actual word in the language you chose.
Great, isn't it?! And simple too!

Now, why am I sad?

I am sad because at one of the organizations where I worked, I proposed the idea, with technical implementation details, of building a language translator. Sadly, it was rejected. If it had gone ahead, we would have been among the first to build such a thing. And, you know, it was for the mobile phone...

Though my idea of translation was quite different from the Gmail team's, since I was thinking of voice-based language translation.

Hope I could do something easier and better...

Wednesday, April 08, 2009

Build spam control functionality into your site using captchas

Do you own a website? Are you working on one? Do you want to make it spam free? If the answer to any of the above is yes, you may want to consider using an open source captcha service.

We use a captcha when we want to avoid unnecessary requests being sent to the server through web forms that are open to users. There are plenty of scripts and programs, popularly known as automated bots, that can submit thousands or millions of unwanted entries to the server through those forms.

To keep these bots from abusing your forms, you need to build some validation before the form data is posted to the server, so the application knows that the data being sent was entered by a human and not a bot.

To do this, you can display an image chosen at random each time the form page is accessed. The image shows some alphanumeric characters, and you ask the user to type them into a text box placed next to the captcha. If what the user enters matches the captcha, you can be reasonably sure it is a human and not a bot, since a program will not be able to make out the characters displayed in the image.


Try to build this yourself and you could find it to be a heavy-duty task.

Now Carnegie Mellon University offers a service, reCAPTCHA, which lets developers implement this by making simple modifications to the sample code provided for your application's technology: ASP.NET, PHP, Java, Perl, etc.

Each time a user fills in the form, he types the captcha, and the code in your web page connects to the reCAPTCHA server, which in turn validates that the user is genuine.

Check this out at www.recaptcha.net

Sunday, December 28, 2008

Increasing the DB Size in SQL Server 2005

Hi again. I am going to write a small post on one of the common issues related to databases, specifically SQL Server 2005 databases. This is of course not a bug; it is only about tweaking server and application performance.
By default, a database is created with a default initial size, but this size is not sufficient to hold your data for long. As things go well on the business end, you will see data growth in your databases. By default, all databases have an autogrowth property set, which avoids data loss or application failure.

But this is not enough!!

In my personal experience, the database grew to its default maximum size and then kept growing automatically in extra 1 MB increments. That hurt my application's performance badly and made my portal slower than anything! During my investigation I checked the database size and increased it to a sensible level manually.

This is what I actually did:

1. Opened SQL Server Management Studio 2005.
2. Right-clicked the database name, e.g. Mx_Content_DB.
3. Selected Properties.
4. On the Files page, increased the initial size of the database by 1 GB and set autogrowth to 10%.
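If you prefer to script it rather than click through Management Studio, the equivalent T-SQL run via sqlcmd might look roughly like this; the database name, logical file name, target size and server name are placeholders, so check yours on the Files page first.

# Placeholders throughout: database, logical file name, target size and server.
# The logical file name is listed on the Files page of the database properties.
$query = "ALTER DATABASE [Mx_Content_DB] MODIFY FILE (NAME = N'Mx_Content_DB', SIZE = 11264MB, FILEGROWTH = 10%)"
sqlcmd -S "SQLSERVER01" -E -Q $query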

After a database server restart, things started looking fine and performance increased considerably.

Please note that if your application uses multiple databases, you may have to do the same for the other databases if they run into the same issue.

Tomorrow I will update this post with some screenshots that will help you further with your database management activities.

Saturday, December 20, 2008

Make your SharePoint data loss-proof - Backup Strategies

SharePoint is Microsoft's business-class portal framework, and it is also a lucrative option as an intranet portal because it supports, and is built around, the Microsoft Office suite.

All the entities of a SharePoint site, like lists, documents, pages, links, themes and site information, along with their data - almost everything - are stored in a SQL Server 2005 database.

For managing content and having a solid plan to make your valuable data loss-proof, it is important that you follow a 3-way backup strategy.

1. SQL Server 2005 database-level backups.
2. SharePoint site backups from SharePoint Central Administration.
3. Code-level backups (if you are using custom user controls for a highly customized SharePoint experience).

There is also a popular tool, Aveo's, for backing up the deeper-level SharePoint entities like list columns, versions, document libraries, lists and list items.

Again, I would like to emphasize that the database-level backups should follow this approach:
a) Weekly full backups.
b) Differential backups on alternate days.
c) Daily transaction log backups.

SharePoint server-level backups can be planned as below:
a) Farm-level backups.
b) Site collection-level backups.
c) Web application-level backups.
d) Site-level backups.
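For the farm and site collection levels, the 2007-era stsadm.exe covers both cases from the command line; here is a rough sketch in which the share path, site URL and file names are placeholders.

# Share path, site URL and file names are placeholders; adjust to your farm.
$stsadm = "$env:CommonProgramFiles\Microsoft Shared\web server extensions\12\BIN\stsadm.exe"

# Full catastrophic (farm-level) backup to a shared folder.
& $stsadm -o backup -directory "\\backupserver\spbackups" -backupmethod full

# Backup of a single site collection to a file.
& $stsadm -o backup -url "http://portal/sites/teamsite" -filename "D:\Backups\teamsite.bak"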

Code-level backups include backing up the virtual directories each time before you deploy the latest changes to the server.

I am sure that when you carry out this backup strategy you will be in a better position to secure your data and to restore it easily and effectively when needed.

Thursday, October 30, 2008

Google offers support for OpenID

There are so many websites, portals and other web applications that ask for a username and password in order to access their products/services. It is really difficult (and getting more difficult) for anyone to remember so much login information. I have personally lost login credentials for some websites quite a few times. If you also find it a challenge to remember and recall the correct login ID for your favourite websites, then OpenID comes to your rescue.

OpenID is a simple notion of an open identity: you register once and use the same user ID at a number of websites. Currently some of the most popular brands on the web support OpenID.

Google has just announced that it will also support OpenID 2.0-based credentials for access to most of its websites and web applications. With Microsoft, Yahoo and many other key players already nodding and giving users with an OpenID clearance to access their websites, this is a great move towards user friendliness. It also provides better manageability of user records and saves a lot of user-database maintenance work, without compromising the crucial per-user data on which user-specific features are based.

You can also register your ID for free with OpenID and be part of the current trend of create-once, use-for-a-lifetime user logins.

Saturday, October 11, 2008

Google RSS news - A reality coming soon!!

Google now promises to add RSS links to its search results. This is great news for people who find it difficult to search for relevant information on related topics every day. Now you have the power of RSS combined with Google Search to keep track of the latest news on products, companies, people, places and much more...

Google has recently confirmed this news to organizations that specialize in search engine news. 'Search Engine Watch' is one such organization, which claims to have got the news directly from Google. They have also reported that Google is the only major search engine bringing in this feature. I hope that, after Google, all the other major search engines will include it as well, if they don't want to lag behind in providing the latest information.

Google's decision to bring RSS links to search confirms that its promise to make its products and services more user friendly is genuine. As you are aware, RSS brings the most up-to-date information about any given topic from authority sites, so now the search results in Google will also be fresher and more current.

So all the best for future Googling and enjoy your Sunday !!

Thursday, October 09, 2008

SaaS - Do you know?

There is a lot of research going on in the labs to develop a model that will bring business and technology together like never before, one that would remove the limitations related to technology, cost and operations within organizations. Out of the various software models that have come up, one such model is SaaS.

SaaS - Software as a Service.

SaaS is all about developing a piece of software (probably web based) and offering its features as a service to others - using software as a service. For example, Amazon is known for its strong software model and business success, proven by the success of Amazon itself. Now, anyone else with an idea for a similar shopping site might have to create the software and a suitable model from scratch. Alternatively, he can use Amazon's proven and successful model as the core for his own shopping site: Amazon has provided its API for developers and business organizations that want to build something similar, and they can use that API to build their own site, leveraging Amazon's standard core.

SaaS basically means clients use software provided by its host without having to install that software on their own machines. The clients pay the software provider for the service they receive, which can take the form of a licence to use the software for a given period of time. One simple example is the mail service provided by Google: organizations ranging from startups to large enterprises can use Gmail for their office use, relying on Gmail's mail servers for all the emailing needs within the organization. In doing so, these organizations save the cost of hardware and software setup, power, security and the human resources required to maintain it all; instead, they receive these services directly from Gmail, and they enjoy world-class performance, usability, popularity, security and a hassle-free business solution.

SaaS holds great promise for the way business is going to change and evolve in the coming days, with technology as the forerunner and the key to the success of organizations and their business together.

Sunday, October 05, 2008

Google Book Search Integration Feature - API

Hey Guys!!
I am back after a short break. To give you the first-hand info, quite a few interesting things have happened in my life, both personally and professionally. I have changed city, job and company, and there's a change in my work profile too. Apart from these good changes, it's also sad that I am away from my very near and dear friends in Hyderabad, but the good thing now is that I won't give you any chance to complain about my posts on this blog, because I bought a new laptop today and I am writing this post from it.

Enough scribbling; let's get into what I have to say in the current post!

This is about the new Google API for books.
As you might have guessed, this API lets you display books on topics of your preference; that is exactly what you can do with the Google Book Search API. I think this will be very useful for people who want to buy new books or review a book before actually buying it, and it is a great tool from the people at Google. As far as portals and education-based websites are concerned, people are really going to benefit from it.

Google Book Search can be integrated into a website using the API, which allows deeper customization. As an option, those who are not coders can take the copy-and-paste JavaScript code offered free by Google and embed it in the web page where they want the book search and preview feature.

The URI for the same is: code.google.com/apis/books/

Check it out, generate your code, put it on your website... and get going!

Wednesday, January 02, 2008

Web OS - Building an OS above the existing Internet Layer


Of course, everyone who uses computers knows how important the operating system is in making a computer a usable piece of engineering. Without an OS, a computer would be very difficult for the common person to use for any task.

The operating system provides a layer between you and your computer's internal hardware. It acts as an interface so that your commands can be easily understood by the machine, and it provides a platform for running the various applications on your computer that otherwise could not run at all. Another way to look at the same scenario: if you didn't have a standard operating system on your computer, then the application you are using right now would need to have the system software's capabilities itself. Every application would have to be almost an operating system in order to run on your computer.

Just imagine how difficult it would then be to develop applications: each application would need a layer of code written to interact with your PC's hardware. If you have an operating system explicitly responsible for interacting with the hardware, developers don't need to build these machine-interfacing features into their applications. A lot of time, energy and resources are saved by having an OS sitting on top of the hardware, with all the other applications sitting on top of the OS.

Now you concentrate only on developing your application's core functionality, not on OS capabilities to make it run on any PC. If you understand this concept, you can understand the Web OS (Web Operating System) concept as well: the Web OS is a platform for running a similar kind of application in the browser. If you look at this from a developer's perspective, you will see what I mean.

If you want to develop a new shopping site, you can use the framework of a tried, tested and already industry-standard application like Amazon. You can build your website on top of the Amazon shopping framework, giving it a solid architecture without having to build your shopping site from scratch. So a layer sits above the web where people who want to build a shopping site can use Amazon's APIs and framework to develop their custom sites. The same can be done for search engines using Google's APIs, portals using Drupal's framework, and social networking sites using Facebook's framework. Likewise, whole new applications would use an industry-standard application's framework or APIs so that their business is supported by the solid technical architecture imposed by such proven applications, saving the new business valuable time, energy and money.

Friday, November 23, 2007

How to get your Site Indexed by Google within a few days?!

You have created a great website and have been wondering when people will get to know about it. You want SEO for your website to work, and you want to let Google know that you have useful, quality content. What are you going to do? First things first, and always first!

To make your site rank in the search engines, especially Google, you should have your web pages indexed in Google's database. Any guesses how you do that? Is it by going to Google's directory or the DMOZ directory to submit your website?
If your answer to these questions is "yes", then ummm... you are probably right, but you have failed to hit the bull's eye!

If you want your website indexed in Google with proper information about each and every page, i.e. their metadata, you need to use Google's Webmaster Tools. Oh, no need to worry, it's absolutely free, truly in Google's style!

First of all, you need to create an XML sitemap of your website. This sitemap will contain a list of all the URLs on your site, and along with each URL it should carry some information about what that URL (web page) contains. You also have to include some metadata for your web pages, namely a title and description for each and every page, and these should be unique. Say your website has 1,000 pages. Huh?! Is it possible to hand-write a thousand-plus sets of XML tags, with title, description and URL for each page? Of course not. That is why you use tools: tools that create the XML sitemap for you as soon as you paste in your website's home page URL. XMLSitemaps.com is one such useful site for creating XML sitemaps.
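At its simplest, the sitemap format (per the sitemaps.org protocol) is just a list of <url> entries. A tiny hand-rolled sketch is below, with placeholder URLs, in case you want to generate the file yourself instead of using an online tool:

# URLs below are placeholders; add one entry per page of your site.
$urls = "http://www.mysite.com/", "http://www.mysite.com/about.html", "http://www.mysite.com/contact.html"
$entries = $urls | ForEach-Object { "  <url><loc>$_</loc><changefreq>weekly</changefreq></url>" }
$lines = @('<?xml version="1.0" encoding="UTF-8"?>', '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">') + $entries + '</urlset>'
$lines | Set-Content -Path "sitemap.xml" -Encoding UTF8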

Now, rather than writing in paragraphs, I will list the steps point-wise so they are easier to follow.

1. Download sitemap.xml to your desktop and upload it to the server directory where your website's home page lives.

2. Now you should be able to see the XML sitemap by browsing to www.mysite.com/sitemap.xml.

3. Open www.google.com/webmasters/sitemaps and sign in with your Google Account username and password. You will see a dashboard page.

4. On the dashboard, under Sites, enter your website's URL and press the "Add Site" button. Your site will appear underneath, and on the right-hand side you will find a Sitemap column.

5. Under this column, click the "Add" link. A new page will open; in the choose type list box, choose General Sitemap.

6. A text box will appear below, where you enter the sitemap URL --> www.mysite.com/sitemap.xml, and then click the "Add General Sitemap" button.

7. On the next page a verification code will appear. Prepare a plain HTML page, name it with the code provided, and finally upload the page to the same directory on your web server where your home page is placed.

8. Once you have done all this, your site's XML data is submitted to Google. You may also want to verify that the sitemap.xml file is free of XML errors, which Google is very strict about.

9. After the verification is over, wait 48 hours, open Google.com and type site:http://www.mysite.com in the search box to see the listing. You will see the pages of your website indexed. Compare the number of pages against your list to confirm that all pages have been indexed properly!

Generally they will be!

If you don't see them within 48 hours, you can check the status updates at
www.google.com/webmasters/sitemaps. Generally it takes 2-5 days for Google to finish.