Hello, I have an idea and I wonder if it can happen. I'm a web designer, and when I make a new site for a client, the links from the old site that Google has indexed should be redirected to the corresponding links on the new site so they don't break. Sometimes there are thousands of links, and it's a lot of work to do by hand. Can you create a bot that crawls the old links and creates redirects to the new ones in a .txt file?

Here is an example: we have a product named "White Winter Hat" with the link "https://oldsite.com/product/white-winter-hat.html". The same product on the new site has the link "https://newsite.com/white-winter-hat". The bot's job is to find the product link and create a new link according to the structure of the new site, using the name of the product as the indicator. In short, the bot works as follows:
The bot crawls the site, finds a page called "White Winter Hat" with the URL https://oldsite.com/product/white-winter-hat.html, creates the new link "https://newsite.com/white-winter-hat", and records the pair in a .txt file. The bot should do this for every page on the old site.
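To make it more concrete, here is a rough sketch in Python of what I imagine the bot doing. I'm assuming the old site can be crawled by following internal links, and that the product slug can be taken from the last segment of the old URL; OLD_SITE, NEW_SITE, and the output filename are just placeholders, not anything final:

```python
# Rough sketch only -- assumes the old site is crawlable by following
# internal links, and that the product slug survives in the old URL.
# OLD_SITE, NEW_SITE and the output filename are placeholders.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

OLD_SITE = "https://oldsite.com/"
NEW_SITE = "https://newsite.com/"

def crawl(start_url):
    """Visit every internal page once, yielding each URL found."""
    seen, queue = set(), [start_url]
    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        yield url
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip pages that fail to load
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if link.startswith(OLD_SITE) and link not in seen:
                queue.append(link)

def new_url(old_url):
    """Build the new link: take the last path segment
    ('white-winter-hat.html'), drop the extension, and
    append it to the new domain."""
    slug = urlparse(old_url).path.rstrip("/").rsplit("/", 1)[-1]
    slug = slug.rsplit(".", 1)[0]  # strip .html and similar extensions
    return NEW_SITE + slug

with open("redirects.txt", "w") as out:
    for old in crawl(OLD_SITE):
        out.write(f"{old} -> {new_url(old)}\n")
```

The output here is plain "old -> new" pairs, but it could just as easily be written as .htaccess "Redirect 301" lines or whatever format the server needs.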
The structure must be customizable, because every site has its own structure. Maybe there should be some kind of tags for domain, category, page/product name, etc., and then I just order the tags to define the structure of the new links: domain/page (https://newsite.com/white-winter-hat). A sketch of what I mean follows below.
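For the tag idea, I imagine each project getting a small template where the tags can be reordered. This is just a sketch, and the tag names ({domain}, {category}, {page}) are made up:

```python
# Sketch of the tag idea: the template is per-site configuration,
# and the tag names are placeholders I invented for this example.
TEMPLATE = "{domain}/{page}"  # order the tags per project

def build(template, **tags):
    return template.format(**tags)

print(build(TEMPLATE, domain="https://newsite.com", page="white-winter-hat"))
# -> https://newsite.com/white-winter-hat

# A different site might need something like:
# TEMPLATE = "{domain}/{category}/{page}"
```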
Sorry it's so messy; it's just an idea I'm thinking about right now. If someone has a better one, I'm glad to hear it. And most importantly: is it possible to create a bot for this?
Thank you.