Google's 'Project Owl' to Improve Searching, Surfing & Safety

Last Update: August 09, 2019


I recently wrote a HOW TO blog on my HEN Affiliates website called The Simplest Ways to Survive Google’s Newest Algorithm Changes.

Google has been changing its algorithm quite often and most recently offered advice in the form of a series of questions for content writers. These questions point to the direction Google wants your website articles to take. One main reason Google is so adamant about high-quality content, particularly lately, is a huge issue it has been facing since last December.


There has been a growing search quality problem: fake news, disturbing answers, and offensive search suggestions appearing at the top of the search results. So, in order to get a handle on it all, Google formed "Project Owl" to address these issues and put the emphasis back on authoritative content.

It's true that Google has had issues like this for years, but unfortunately it has now gotten much worse. The problem goes beyond spam to content that is falsified and offensive. So now, Google is taking action.

This past February, Google began beta-testing a way for people to report problematic search suggestions, and today it is live for everyone worldwide. I will discuss this more below.


Google named this new approach to problematic content "Project Owl" for a distinct reason: an owl represents wisdom, the perfect symbol for this project. Google wants to bring wisdom back to its platform.


"Fake news" is a term for mass-produced content that serves someone's unruly opinions regardless of the actual facts. Furthermore, people are searching for and finding this content in such volume that it is greatly influencing the search suggestions in offensive and dangerous ways. This poor-quality content has shown up for popular searches, and it is completely fabricated.

I mean, can you imagine what this could lead to? Not being able to trust whether the articles we read on the net are true?


Google has gone to great lengths to improve its autocomplete search suggestions. You know, when you begin to type a search and suggestions drop down to speed things up (like the Alphabet Soup method).
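To picture what autocomplete is doing, here is a toy sketch in Python. This is purely illustrative and is not Google's actual system: the function name, the sample queries, and the alphabetical ranking are all my own assumptions, since real suggestion engines rank by popularity and many other signals.

```python
def suggest(prefix, past_queries, limit=5):
    """Return up to `limit` past queries that start with the typed prefix.

    A toy prefix-matcher, NOT Google's real autocomplete: real engines
    rank suggestions by popularity and other signals; here we just
    filter by prefix and sort alphabetically.
    """
    p = prefix.lower()
    matches = [q for q in past_queries if q.lower().startswith(p)]
    return sorted(matches)[:limit]

queries = ["alphabet soup", "algorithm update", "seo tips", "seo tools"]
print(suggest("se", queries))  # ['seo tips', 'seo tools']
```

The point of Project Owl is that the pool those suggestions are drawn from has been polluted, so even a perfectly good matching step can surface problematic predictions.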

Right now, though, these suggestions are leading people astray and detouring them into problematic, dark topics. It reminds me of the dark web.

Now, a new "Report inappropriate predictions" link will appear below the search box. Clicking that link brings up a form where people can select one or more predictions and report them.

These unsavory predictions can be reported as sexually explicit, violent, hateful, or involving dangerous and harmful activity. There is also an "Other" category, and comments are allowed. Notably, Google has never before published reasons why something might be removed...until now.
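Just to make the reporting flow concrete, here is a hypothetical sketch of how one of these report submissions might be modeled. The class name and fields are my own invention; only the category names come from the article above.

```python
# Hypothetical model of a "Report inappropriate predictions" submission.
# The categories mirror the ones Google's form offers; everything else
# (class name, fields, validation) is an assumption for illustration.
from dataclasses import dataclass

CATEGORIES = {"sexually explicit", "violent", "hateful",
              "dangerous or harmful activity", "other"}

@dataclass
class PredictionReport:
    prediction: str   # the autocomplete suggestion being reported
    category: str     # one of CATEGORIES
    comment: str = "" # free-text comment, mainly used with "other"

    def __post_init__(self):
        if self.category not in CATEGORIES:
            raise ValueError(f"unknown category: {self.category}")

report = PredictionReport("some offensive suggestion", "hateful")
print(report.category)  # hateful
```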


You can leave your comments and input below, if you'd like.

Google hopes the feedback from this project will be useful, informing algorithm changes that improve all search suggestions.

Hopefully, this project won't backfire by pulling truly high-quality content from the searches after mistaking it for problematic content. Google will probably require several of the same complaints before pulling anything.

It's too bad that there are people out there who abuse the system. But, Google is right on top of it! This really puts into perspective how important it is to write genuine, high quality content, and develop a trustworthy website! Now you know why Google ranks you up so high if you do!

If you wish, please LIKE and SHARE this "high quality content" to spread the news!

Erin :)!

Join the Discussion
SimoninAsia Premium Plus Featured Comment
Personally this all makes me very wary. I don't trust huge corporations like Google to act as arbiters of what is true and untrue, offensive or inoffensive. Recent cases of Facebook, Twitter and YouTube discriminating against conservative voices point to how this could very easily lead to reporting people to silence opinions you simply disagree with.
herinnelson Premium
Yes, Simon, you state a good point. It could actually blow up into something that gets out of hand. It seems every action taken nowadays has to be handled delicately and tiptoed around.

Erin :)!
GazBower Premium
I do agree something needs to be done about harmful, inappropriate and sexually explicit content. However, I agree that people should be able to search for whatever they want within reason. This opens up the possibility of content being shut down just because a small minority don't agree with it.
This is censorship and people being told what opinions they are allowed to hold is wrong.
Fake News works both ways
Global media companies shouldn't be in a position to decide what is true and what is false.
In light of this my hope is that Google don't take this too far. Sure they can highlight what they think is suspicious or harmful content but at the end of the day people should be able to make up their own minds.
Just my opinion; by all means disagree if you like, that's the beauty of free speech.

herinnelson Premium
You pose some great arguments here, Gaz! People have the right to free speech. But, then again, some take that right too far where it hurts the common good. It's a very touchy situation that Google needs to handle with extreme care. Great thoughts, my dear! xo

Erin :)!
terrycarroll Premium
I applaud these changes and unlike Sherlock77 in the response below, I do not see them as being imposed in a Big Brother manner. The "Report inappropriate predictions" option is just that, an option.
You can either type through the predictive suggestions until you complete your search topic or just not report the problem. It is not compulsory, but I see it as a way of allowing the searcher to override any Google "preferences".
I am all for it and it's about time Facebook and Twitter did a similar exercise.

Thanks for sharing, Erin

herinnelson Premium
Great take on the situation, Terry, and I can see your point. I think we should be given the "option" of reporting scandalous and inappropriate content as a way of eliminating it...if it is truly obvious spam. I also think that Google is just protecting their searches to make them appropriate and helpful for their searchers. Giving the option to report any suspicions is their way of both protecting their viewers, as well as themselves. We've lived by their rules in the past, and will live by them in the future!

Erin :)!
sherlock77 Premium
I actually think people should be allowed to search for whatever content they want to search for online (within reason). It shouldn't be up to Google to decide what people should view and what they shouldn't.

Google has its own agenda, and that agenda is NOT to give people the most relevant results for their search query these days.

With some types of searches, practically the same 10 corporate websites continually dominate the first page of results. Either that or half the first page is ads.

Google was invented as a way for people to be able to find websites and information online. Now they are so huge and powerful, they've decided they'll dictate what we're allowed to see and what we cannot.

They've become yet another "big brother" government department.
herinnelson Premium
You've provided some great insight here, Darren, and what you said tends to be true. I don't think Google is trying to "completely" control the searches that take place on their platform, but maybe eliminate "obvious" potentially harmful and hateful information. WHAT information is tagged harmful and hateful remains to be seen. There's a fine line here that they, of course, have the final say about, but giving people the "option" to report doesn't make them in "complete" control. It gives others a chance to voice their opinions, as well.

Thank you for your input!!

Erin :)!
JKulk1 Premium
They sound like much needed changes to me. Jim
herinnelson Premium
I definitely agree, Jim!

Erin :)!