Ask HN: Can we create a new internet where search engines are irrelevant?
The only way I can imagine this working is by twisting the definition of the words "search engine" enough that you can claim there aren't search engines any more, when really there still are, just under a different name.
Search engines aren't actually the "problem" that OP wants to address here, though. He just doesn't like the specific search engines that exist right now. What he should really be asking is how a search engine could be implemented without the particular flaws that bother him.
Plus the web is not the whole internet.
You could stick to Gopher.
Or use other search engines. There are hundreds. Hundreds.
Maybe not as useful as the dominant ones, though.
I have a very difficult time imagining an internet that is both interoperable and ranking-free. Now, that having been said, we are well outside my area of expertise here so I'd love to hear from folks who know more than me.
What about just giving transparency to what the ranking is and letting people control it? Analogous to "sort by new/best/top", but ideally with more knobs to tweak and a bunch of preset options?
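A rough, hypothetical sketch of what user-tunable ranking could look like, with transparent weights and presets (all signal names, weights, and presets here are made up for illustration, not any real engine's API):

    # Hypothetical sketch: transparent, user-adjustable ranking weights.
    # Signal names, weights, and presets are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class Page:
        url: str
        freshness: float       # 0..1, how recently the page changed
        inbound_links: float   # normalized count of links pointing at it
        text_match: float      # 0..1, how well the text matches the query

    # Preset profiles the user can pick, or override signal by signal.
    PRESETS = {
        "new":  {"freshness": 0.8, "inbound_links": 0.1, "text_match": 0.1},
        "best": {"freshness": 0.1, "inbound_links": 0.5, "text_match": 0.4},
    }

    def score(page, weights):
        return sum(weights[name] * getattr(page, name) for name in weights)

    def rank(pages, weights):
        # The user can see (and change) exactly why results are ordered this way.
        return sorted(pages, key=lambda p: score(p, weights), reverse=True)

The point is only that the ordering criteria are visible and editable rather than a black box.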
Then it’s just more easily abused by SEO. “Best” according to who? Votes? Number of views? Page rank? All numbers can be manipulated.
Yeah, and please only include lightweight pages with short texts, for example.
That could work, I suppose, but I do wonder how much it would slow everything down.
Still, it's fun to imagine what it might look like if only...
I think the OP is looking for an answer to the problem of Google having a monopoly that gives them the power to make themselves impossible to challenge. The cost to replicate their search service is so astronomical that it's basically impossible to replace them. Would the OP be satisfied if we could make cheaper components that all fit together into a competing but decentralized search service? Breaking down the technical problems is just the first step; the basic concepts for me are:
Crawling -> Indexing -> Storing/hosting the index -> Ranking
All of them are expensive because the internet is massive! If each of these were isolated but still interoperable, we'd get some interesting possibilities: you could have many smaller, specialized companies that focus on better ranking algorithms, for example.
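To make that concrete, here's a minimal sketch (assuming nothing beyond the four stages above; all interface names and signatures are invented) of how crawl, index, storage, and ranking could be separate, interoperable pieces operated by different parties:

    # Illustrative only: splitting the pipeline into swappable components.
    # Every name and signature here is made up for the sketch.
    from typing import Iterable, Protocol

    class Crawler(Protocol):
        def crawl(self, seeds: Iterable[str]) -> Iterable[tuple[str, str]]:
            """Yield (url, raw_text) pairs."""

    class Indexer(Protocol):
        def index(self, url: str, text: str) -> dict[str, int]:
            """Turn one document into term -> frequency postings."""

    class IndexStore(Protocol):
        def put(self, url: str, postings: dict[str, int]) -> None: ...
        def candidates(self, query: str) -> list[str]:
            """Return URLs whose postings match the query terms."""

    class Ranker(Protocol):
        def rank(self, query: str, urls: list[str]) -> list[str]:
            """Order the candidates; this is where vendors could compete."""

    def search(query: str, store: IndexStore, ranker: Ranker) -> list[str]:
        # Any ranker can be plugged into any index store that speaks the
        # same interface -- that's the interoperability being imagined.
        return ranker.rank(query, store.candidates(query))

One company could run crawlers, another could host index shards, and many could compete on rankers against the same shared indexes.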
Sigh, enough daydreaming already...
Not sure how that can be implemented, but I'm sure it will only lead to great amounts of SEO abuse. It only works if everybody is acting in good faith.
Would need human curation to select the best websites in each field.
Yahoo back in the day with its categories, and later Fazed.net with its curated links, made for a nice time for a while.
Pay to play was the problem there. I had the highest ranking joke page on webcrawler for a stint, but Yahoo wanted $500 to put me on top. My 15 year old self was not interested.
That’s pretty much what all of the site aggregators were. I ran a couple of communities on yahoo and some other sites. There were also services like Archie, gopher, and wais, and I am pretty sure my Usenet client had some searching on it (it might have been emacs - I can’t remember anymore). I remember when Google debuted on Stanford.edu/google and realized that everything was about to change.
It worked because the web was much smaller.
Or AI to rank and filter out the things you need based on public indexing. Preferably there'd be several AI assistants to choose from. Things seem to be moving in that direction anyway.
The problem is that personalization of search results tends to create information bubbles. That is the reason I prefer DDG over Google.
AI won't help, since it'll be programmed to show only what its owners want us to see.
With your own customization, done locally.
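As a hedged sketch of the "done locally" part: assume a public or shared index returns plain result URLs, and all personalization happens on the user's machine (the preference fields below are hypothetical, not any real service's API):

    # Hypothetical: local-only personalization on top of a shared public index.
    # Nothing about the user's preferences ever leaves their machine.
    from dataclasses import dataclass, field

    @dataclass
    class LocalPrefs:
        blocked_domains: set = field(default_factory=set)   # never show these
        boosted_domains: set = field(default_factory=set)   # float these up

    def personalize(results, prefs):
        # 'results' is a plain list of URLs from the shared index.
        kept = [u for u in results
                if not any(d in u for d in prefs.blocked_domains)]
        # Stable sort: boosted domains first, everything else keeps its order.
        return sorted(kept, key=lambda u: not any(d in u for d in prefs.boosted_domains))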
Couldn't metasearch engines like metager help with this?