For the past year or so, I’ve been using and enjoying the search engine Kagi. Its search results are…fine, no worse than others, and it’s ad-free, it states privacy as a primary goal, and it seemed to have a better ethical sense than its competitors.
Or so I hoped.
1/
Kagi recently started using some services provided by Brave, a company run by immensely objectionable people. Kagi community members rightly raised concerns about this.
I was curious to see how Kagi would respond. This is a tricky question that requires a thoughtful, careful answer. Their response would be telling: not just about the question of Brave, but about their general ethical outlook.
2/
I’m sympathetic to Kagi’s dilemma. Brave may well provide useful services to them. And it is impossible to completely avoid engaging with people and companies who do harm in the world; that is our reality.
We can’t always disengage. What we •can• do, at a bare minimum, is think carefully about how we engage, and make wise decisions (as businesses and as individuals) that take into account our indirect impact on the world.
Again, these community concerns merit a thoughtful response.
3/
What I found was _not_ a thoughtful, careful response. What I found was the founder of Kagi saying:
“Politics finding its way into tech is one of the reason we do not have innovation any more.”
https://kagifeedback.org/d/2808-reconsider-your-partnership-with-brave/5
Well shit. That is the reddest of red flags.
4/
@inthehands "We don't want politics in XYZ" is just another way of saying "we like the political situation as it is". Always. There is no "politically neutral" model of interaction. What we should promote is a *cosmopolitan* model, where people can express a variety of political views safely. Some views (e.g. those that deny others' rights including the right to their own opinion) should still have consequences, but most should be able to *coexist*.
@Obdurodon I mostly agree.
On top of that, I want a model where people actually think about the consequences of their actions, and listen when others point those consequences out.