search suggestions
Multiple experiments, each with diverse groups of 300 subjects from throughout the U.S., were conducted to determine whether search suggestions (sometimes called "autocomplete" suggestions) have the potential to impact people's searches.

The first automated search suggestion system was introduced as an opt-in tool by Google, Inc. in 2004 to accelerate the search process. In 2008, the tool became mandatory, and, in recent years, the number of suggestions has been reduced from 10 to 4 or fewer, with Google officials acknowledging the company actively censors suggestions.

In the first experiment, conducted shortly before the 2016 presidential election, subjects were shown four sets of search suggestions: two related to the Republican nominee for vice president and two related to the Democratic nominee for vice president.

For each search, subjects could select one of four search suggestions or could type their own search term. Each pair of searches (one pair for the Republican nominee, one for the Democratic nominee) was identical except that in one of the searches, one of the search suggestions was negative (e.g., "Tim Kaine scandal"); all other items in all the searches were either neutral or positive.
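To make this design concrete, the sketch below encodes one such pair of suggestion lists for the Democratic vice-presidential nominee. Apart from the quoted "Tim Kaine scandal" item, every suggestion string here is a hypothetical placeholder, not one of the study's actual stimuli.

```python
# Illustrative stimulus pair for one candidate. Only "Tim Kaine scandal"
# appears in the article; every other suggestion string below is a
# hypothetical placeholder, not one of the study's actual items.
control_list = [
    "Tim Kaine speech",      # neutral (hypothetical)
    "Tim Kaine biography",   # neutral (hypothetical)
    "Tim Kaine senate",      # neutral (hypothetical)
    "Tim Kaine interview",   # neutral (hypothetical)
]

treatment_list = list(control_list)
treatment_list[1] = "Tim Kaine scandal"  # the single negative item

# Each subject sees one list and can click any of the four suggestions
# or type a search term of their own, giving five possible outcomes.
print(control_list)
print(treatment_list)
```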

Consistent with research on negativity bias, the negative items attracted about 40% of clicks, twice as many as one would expect by chance. Remarkably, people clicked the negative items about 5 times as often as they clicked the corresponding neutral items in the control questions, and people who were undecided about their candidate choice clicked the negative items more than 10 times as often as the neutral items.
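The "by chance" baseline here presumably treats the four displayed suggestions plus the option of typing one's own term as five equally likely outcomes, so each would attract about 20% of clicks. A minimal check of that arithmetic:

```python
# Chance baseline, assuming the four displayed suggestions and the option
# of typing one's own term are five equally likely outcomes (an assumption
# inferred from the reported "twice chance" comparison).
n_outcomes = 4 + 1
chance_rate = 1 / n_outcomes           # 0.20

observed_negative_rate = 0.40          # reported click share of negative items

print(f"chance rate per outcome: {chance_rate:.0%}")                        # 20%
print(f"observed vs. chance: {observed_negative_rate / chance_rate:.1f}x")  # 2.0x
```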

Consistent with research on confirmation bias, people affiliated with one political party selected the negative suggestion for the candidate from their own party less frequently than the negative suggestion for the other candidate. The second experiment, in which a negative search term was paired with four different neutral terms, largely confirmed these findings and also showed the varying extent to which different neutral terms can compete with a negative one.

It also demonstrated an order effect: The higher the negative search term appeared in the list of terms, the more clicks it attracted:
[Figure: search suggestions]
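The order effect shown above can be illustrated with a toy model; the position weights and the strength of the negative item's pull below are assumptions made for this sketch, not figures from the study.

```python
# Toy model of the order effect; every number here is an illustrative
# assumption, not a figure from the study.
position_weight = [0.40, 0.28, 0.19, 0.13]   # assumed attention decay by rank

def negative_click_share(negative_position, negative_pull=5.0):
    # Share of suggestion clicks the negative item gets at a given rank.
    # The 5x pull roughly echoes the negativity-bias result above.
    pull = [1.0] * len(position_weight)
    pull[negative_position] = negative_pull
    weighted = [w * p for w, p in zip(position_weight, pull)]
    return weighted[negative_position] / sum(weighted)

for rank in range(len(position_weight)):
    print(f"negative term at position {rank + 1}: "
          f"{negative_click_share(rank):.0%} of suggestion clicks")
```

Under these assumptions the negative term's expected click share falls steadily as it is moved down the list, which is the qualitative pattern the experiment reported.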
A third experiment confirmed the previous findings while also controlling for the frequency and arousal levels of the terms. It demonstrated that a low-valence (that is, highly negative) term such as "suicide" attracted more clicks than moderate- or high-valence terms such as "fashion" or "holiday":

[Figure: click valence]
Finally, a fourth experiment showed that maximum control over a user's choice was exerted when four search suggestions were displayed.

When the goal is to make sure that a user clicks on an offered suggestion rather than completing his or her own search term (presumably to direct traffic in ways that produce the most benefit for the search engine company), control is maximized when two opposing tendencies overlap.

Adding more search suggestions increases the likelihood that people will click on one of them (red line in the figure below) but also dilutes the effectiveness of a negative term (blue line below). With four suggestions, the power of the negative term and the likelihood of clicking on a search suggestion are simultaneously maximized:
[Figure: search control]
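One way to see why these two tendencies might peak together is a toy calculation: let the probability of clicking any suggestion grow with the number of suggestions shown, let the negative item's share of those clicks shrink as it is diluted by neutral items, and look at their product. The functional forms and parameters below are illustrative assumptions, not the study's model.

```python
# Toy model of the two opposing tendencies; the functional forms and
# parameters are illustrative assumptions, not the study's model.

def p_click_any(n, per_item=0.25):
    # Chance a user clicks *some* suggestion rather than typing their own
    # term, assumed to rise with the number of suggestions shown.
    return 1 - (1 - per_item) ** n

def negative_share(n, negative_pull=5.0):
    # The negative item's share of suggestion clicks, diluted as more
    # neutral items are added alongside it.
    return negative_pull / (negative_pull + (n - 1))

for n in range(1, 11):
    product = p_click_any(n) * negative_share(n)
    print(f"{n:2d} suggestions: P(click any)={p_click_any(n):.2f}  "
          f"negative share={negative_share(n):.2f}  product={product:.2f}")

# With these particular (made-up) parameters the product happens to peak
# around four suggestions, echoing the pattern reported above.
```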
Our research also confirms that on computers (but not necessarily on mobile devices), Google usually shows four search suggestions:
[Figure: search results]
The results of these experiments suggest that differential suppression of negative search terms for one candidate will "nudge" voters toward web pages that portray that candidate positively, while exposing people searching for information about the opposing candidate to web pages that portray that candidate negatively.

Differential suppression of negative search terms, a technique that some commentators claim Google used to support Hillary Clinton before the 2016 election in the US, appears to be a powerful way of influencing both voting preferences and, perhaps, people's opinions in general.