Is it Safe to Serve Website Pages as Raw PHP (Hypertext Preprocessor) Files?

I was contacted by someone that wants to work with me, but ONLY wants this type of Search Engine Optimization (SEO).
He worked with a freelancer and got "page 1 rankings in 2 days". Obvious red flags, but he sent me the sitemap of what was done and it seems a bit strange to me that it would work at all – let alone within 2 days.
This image is a sitemap screenshot, and the only difference I can find is the H1 at the top of each page, but it all leads to the same URL.
I'm not really interested in working with the guy, but what is the deal with several .PHP files for the same URL with just a different H1?




So let me get this straight… you're taking someone else's work and posting it in a public forum to ask how it works, with zero intention of working with the individual? Remind me never to work with someone like you

Justin » Roman
I mean he's not identifying the person. He just wants to see what type of tactic this is and if it's beneficial.
Roman – The problem with FUDs like you is that you deter members from asking questions. The lad is trying to learn like we all are, and you're trying to publicly shame him. He isn't stealing work; he's trying to understand it. Get your head out of your own arse, mate.
Roman » Crossan
He is trying to learn at the expense of someone else. As he stated, he is downplaying the value of the other person and has no intention of working with him. It is the intent and the attitude that are the problem. This is why people get burned and don't share information. Sorry, but you need to get off your high horse.
Yeah, this is routine in Search Engine Optimization (SEO) – forensic crawling of competitor URLs. It's not even a thing. But the way you frame it makes it seem like the guy who posted here did something wrong and inappropriate. What exactly would be the problem with posting this data here? I don't see any substantiated problem myself, personally speaking. I have crawled hundreds of different URLs with spider software like Screaming Frog, and thousands of people do the same thing. It's industry standard. Did you even notice the poster respected the privacy of the website owner and omitted the specific URL? You really jumped the gun here without cause.
As for this situation: it looks like auto-generated spam pages built to try to rank for anything related to the root keyword. As another person mentioned, depending on the competition and the TYPE of ranking (in this case local, which is different from national and international), it would be super easy and super fast to rank. But rank for what? In this case, hyped spam nonsense for the most part. Energy would be better spent researching keywords that bring in ROI and using real SEO to rank for those searches – basically. Ah! Just my opinion for ya!
Roman » Jonathan
The other individual shared a piece of his work with this individual with the intent of working with him. The guy who made this post then shares it in a public forum. This violates a basic tenet of trust. This individual has no sense of integrity or values and thus cannot be trusted.
McLaughlan » Roman
That's the reason why I didn't give the URL. It's not a tactic I've seen before, so I'm just trying to understand the concept behind it – isn't that what this group is all about?
I'm mainly not interested as I'm under the impression I could be inheriting someone else's mess, but I'm not totally sure what's going on.
That's cool, I can remind you not to work with me 🙂
Roman » McLaughlan
If you don't understand this basic stuff, my friend, then there is little value in working with you, so don't sweat it. Because you don't understand it, you're devaluing the person and the tactics and labeling it a likely mess. My issue is your attitude towards things you don't understand.

The person who built this is doing it wrong. The problem isn't the core strategy of making lots of pages – it's targeting keywords that get basically no search volume, and then positioning it to the client as "page 1 rankings in 2 days."
The concept is that you publish a page for each city that targets a keyword. Each city gets its own page for that keyword, with virtually identical content. Assuming your content and web page aren't overtly spammy, Google doesn't penalize this type of strategy.
My opinion as to why they don't penalize it is that regardless of the city, the user is getting what they expected from the web page. The user searches PC repair, they get a page related to PC repair, they call and get a PC repair company on the line -> They got what they wanted and everyone is happy.
In this case, the noob who executed this strategy should be condensing the pages. "Best", "affordable", "best price", "guru", and "pro" can all be condensed into one single page: PC Repair Buffalo. The way this is built currently, with dupe content and such precise keywords with basically no search volume, I would be concerned those pages would cannibalize each other.
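To illustrate the condensing idea in code, here's a minimal Python sketch. The modifier set and keyword variants are invented for illustration; the point is that all the modifier variants of one root keyword collapse onto a single canonical page slug instead of getting a near-duplicate page each.

```python
# Hypothetical sketch: collapse modifier variants of one root
# keyword onto a single canonical page slug, rather than building
# a near-duplicate page per variant. Modifier list is invented.
MODIFIERS = {"best", "affordable", "cheap", "top", "price", "guru", "pro"}

def canonical_slug(keyword: str) -> str:
    """Drop known modifier words and build a lowercase hyphenated slug."""
    words = [w for w in keyword.lower().split() if w not in MODIFIERS]
    return "-".join(words)

variants = [
    "Best PC Repair Buffalo",
    "Affordable PC Repair Buffalo",
    "PC Repair Guru Buffalo",
]
# All three variants map to the same page, avoiding cannibalization.
slugs = {canonical_slug(v) for v in variants}
print(slugs)  # {'pc-repair-buffalo'}
```

One page then targets the whole cluster of modifier phrases, which is roughly what "condensing" means here.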

McLaughlan » Colbert
Yeah, that makes total sense. Thanks for the solid explanation mate
Jeff » Colbert
100% agree. It's hilarious how some in here are saying showing this was unethical. It's a f*cking joke, when the only unethical aspect was charging $500 per month for crappy, worthless, messy, outsourced noob work


The theory is that you present content that exactly matches a search phrase. I did something similar way back in the day, and it did help capture long-tail searches for non-competitive terms.
In this instance, they are barely doing any on-page SEO – just changing the H1. And given how many actual searches there are for "Buffalo area pc repair guru", it might actually get you a ranking. I doubt more than 5 people a year would see it. Or it might well get overshadowed in any Buffalo repair search, since it's a big city with competition. Now think of some tiny town without a PC shop… that might actually work, but there may well be zero searches, so who cares?
There are better ways to optimize ranking signals. But 10+ years ago sometimes this would get you decent wins.

McLaughlan » Travis
Yeah, it did strike me as strange the term 'guru' is being targeted – can't say I've ever used that in a search term before myself haha. Thanks mate

Side note… I actually talked to the person who did this on the phone once. He pitched a client of mine, offering 2 day page 1 rankings, etc. The guy is an older dude. He actually coined some widely used phrase (I forgot what phrase it was), and he'll tell you all about it if you call him up.
Basically he just calls up local businesses, sells them on page 1 rankings in two days for something like $500 per month, does a page build like this one via File Transfer Protocol (FTP) using an outsourced team, and that's it. Not a bad hustle.
I know it's him because I found his link in the footer of this client's website. He's a nice guy but his strategy doesn't really drive any tangible results as executed (except for the promised page 1 rankings for a bunch of keywords that don't get any search volume). If he updated it to align with modern best practices, it would probably work out a lot better.

McCombie » Colbert
When the OP mentioned page 1 in 2 days, my mind went right to it being something where he just ranked for a ton of nonsense with no search volume.
Colbert » McCombie
Haha! Yeah man! I can rank on page 1 in 24 hours for "Best basket weaving class at 4pm at the YMCA on 4th Street in [insert podunk city]" 😂

Andrew 🎓
Why not just mass-create, spin certain paragraphs and headings with custom spin directives, and then interlink pages based on proximity? Send all the links to the hub (/locations) afterwards.
I guess you could also redirect some of these smaller city pages to the larger city nearby over time, after you let the page age a bit. Certainly not a 2-day thing, but plenty of software exists that does this.
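For what it's worth, "spin directives" usually means plain spintax – `{option A|option B}` groups where one option is picked per page. A minimal Python expander might look like this (a hypothetical sketch, not any particular tool's implementation; the template text is invented):

```python
import random
import re

def spin(text: str, rng: random.Random) -> str:
    """Expand {a|b|c} spintax groups, picking one option per group."""
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        m = pattern.search(text)
        if m is None:
            return text
        choice = rng.choice(m.group(1).split("|"))
        # Splice the chosen option in and rescan for remaining groups.
        text = text[:m.start()] + choice + text[m.end():]

template = "{Fast|Reliable} PC repair in Buffalo by {local experts|certified techs}."
print(spin(template, random.Random(0)))
```

Run the template once per city page with a different seed and you get the "virtually identical content" pattern described above.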
Side note.
I hate when people keep .php in their URLs.
Why would anyone build .php pages now? When I see that in a URL, I immediately assume the website totally sucks
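For what it's worth, hiding the .php extension doesn't require rebuilding the site. A common Apache mod_rewrite pattern (this assumes an .htaccess file with mod_rewrite enabled; adapt to your server) serves /about from about.php so the extension never appears in the URL:

```apache
# Assumed .htaccess sketch: if an extensionless request matches an
# existing .php file on disk, serve that file internally.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^([^.]+)$ $1.php [L]
```

The URL stays clean while PHP still renders the page, so ".php in the URL" is a choice, not a constraint of the language.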

Why do you assume a website sucks if it's in PHP?
Illia » Toto
A lot of reasons for it:
1) it's easier for blackhatters to steal website traffic through a canonicals change (won't explain the technique)
2) businesses with a real budget will develop their website on React
3) no budget = you can easily outrank them

The glaring red flag from the extremely limited information (sitemap) is camel-case URLs. As a technical SEO, this makes me cringe and I would immediately dismiss this candidate. Fundamentals.
Now the assumption beyond that is shallow, spun content used for local landing pages. Not ideal – however, I have seen this work in the right circumstances, i.e. multiple physical locations. If this is merely service-area "geo-fencing", there are more refined ways of implementing it, in part with Google My Business (GMB) optimization.
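On the camel-case point: converting those URLs to conventional lowercase-hyphen slugs (and 301-redirecting the old ones) is a small fix. A Python sketch, using an invented example segment:

```python
import re

def normalize_slug(segment: str) -> str:
    """Turn a camel-case URL segment like 'BuffaloPcRepairGuru.php'
    into the conventional lowercase-hyphen form."""
    # Insert a hyphen at each lowercase/digit -> uppercase boundary.
    hyphenated = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", "-", segment)
    return hyphenated.lower().removesuffix(".php")

print(normalize_slug("BuffaloPcRepairGuru.php"))  # buffalo-pc-repair-guru
```

Each old URL would then get a 301 redirect to its normalized slug so any existing signals carry over.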

McLaughlan » Jake
Yeah that makes sense mate. Cheers!
Sure thing.


