

Microsoft says Bing’s ‘Tank Man’ censorship was a human error



On Microsoft’s search engine Bing, searching for “Tank Man,” the iconic figure from the 1989 Tiananmen Square protests in China, turned up no image results in the United States for part of Friday. Vice was able to reproduce the results in the US, and heard from multiple users in other countries who ran into the same issue.

Stranger still, searching “Tank Man” or “Tiananmen Square Tank Man” turned up normal web search results on Bing; it was just the images that were mysteriously missing. When The Verge contacted Microsoft for an explanation, the company said, “This is due to an accidental human error and we are actively working to resolve this.”

Search results for “Tank Man” after the accident.

It’s an unfortunately timed accident given that June 4th, 2021 is the 32nd anniversary of the student-led protests in China — an uprising, in response to changes in the country, that was met with assault rifles, tanks, and a massacre. Microsoft did eventually restore results for the specific search, though the well-known image is still noticeably missing. Adding a mention of “Tiananmen” or “Tiananmen Square” pulls up what you’d expect, however. It’s not clear why Bing would weigh generic images of tanks more heavily than a famous piece of visual history, but we’ve reached out to Microsoft to see if that’s normal.

Search results for “Tank Man” after Microsoft addressed the issue.

Bing’s presence in China is somewhat complicated. The search engine disappeared from the country for nearly a day in 2019 seemingly because China Unicom, the state-owned telecom company, was ordered to block it, according to the Financial Times. Microsoft did not disclose the cause of the outage, but service was eventually restored.

Google explored a modified version of Google Search for China, but it faced harsh pushback from employees and US regulators over how the product could affect users in the country.


Tech Moves: Portland’s Act-On adds two new execs; Tanium adds public sector SVP; and more



Gregg Ames, left, and Syed Ahmed, two recent hires at Act-On. (Act-On Photos)

— Act-On, a Portland, Ore.-based marketing automation platform, added Gregg Ames as chief commercial officer and Syed Ahmed as senior vice president of engineering.

Ames previously worked in sales at Marketo, Oracle and Turning.

Ahmed previously was CTO and vice president of engineering at Contiq.

Teddra Burgess. (Tanium Photo)

— Teddra Burgess has joined Tanium as senior vice president of public sector. In the role, Burgess will help public sector organizations implement IT management and cybersecurity at the Kirkland, Wash.-based cybersecurity and systems management company.

Burgess previously worked for Hewlett Packard, Micro Focus International, SAI Global, and ASG Technologies.

— Karen Brewer has become executive vice president and chief marketing officer of Fabric, an online commerce platform.

Brewer will manage the marketing organization. She most recently was a marketing advisor for the company. She previously was with Cisco, Ellucian and Adobe.

— Mel Sears was named president of HNTB’s Northwest division.

Sears will manage 400 employees for the division and oversee infrastructure programs for highways, rail and transit, and more.

Sears joined HNTB, an engineering and architecture firm, earlier this year as a Western Region sales officer.

Jeremy Showalter. (Jeremy Showalter Photo)

— Jeremy Showalter, a Microsoft alum, and Andrew Cho have launched Weave Savings, a fintech focused on savings groups.

Showalter stepped down as CEO of Pique last month, following the acquisition of the AI startup’s intellectual property by Vietnamese payment app MoMo.

Cho, who will serve as Weave Savings’ CPO, was previously director for Tottini Discovery.



Fresh Paint Or Patina Of Ages, That’s The Antique Question



The world of antique furniture and the world of hackers rarely coincide, and perhaps the allure of the latest tech is greater for most of us than that of a Chippendale cabinet. But there are times when analogous situations arise in both worlds, so it’s worth taking a moment to consider something.

This late-17th-century dressing box would not be of such value or interest were a restoration to strip it of its patina. Daderot, CC0.

Antique furniture has survived for hundreds of years before being owned by today’s collectors. Along the way it picks up bumps and scrapes, wear, and even the occasional repair. Valuable pieces turn up all the time, having been discovered in dusty attics, cowsheds, basements, and all sorts of places where they may have been misused in ways that might horrify those who later pay big money for them. Thus there is a whole industry of craft workers in the field of furniture restoration whose speciality lies in turning the wreck of a piece of furniture into a valuable antique for the showroom.

The parallel in our community if you hadn’t already guessed, can be found in the world of retrocomputers. They are the antiques we prize, they come to us after being abused by kids and then left to languish in a box of junk somewhere. Their capacitors are leaking, their cases may be cracked or dirty, and they often possess the signature look of old ABS mouldings, their characteristic yellowing. This is caused by the gradual release of small quantities of bromine as the fire retardant contained within the plastic degrades under UV light, and causes considerable consternation among some retrocomputing enthusiasts. Considerable effort goes into mitigating it, with the favourite technique involving so-called Retr0bright recipes that use hydrogen peroxide to bleach away the colour.

Do We Lose Something In A Quest To Recreate Our Childhoods?

Is this any less a Macintosh because it shows its age? htomari, CC BY-SA 2.0.

In the antique furniture world there are operators at all levels from the shysters pushing imitation furniture made last month in China to the specialist dealers in high-end genuine pieces. Antique restoration has strata to match, and at the quality end they do work to the highest possible standards.

Consider, though, given a priceless antique that needs work: what is the objective? It would certainly be possible to return it to the same condition in which it left the cabinet maker’s workshop hundreds of years ago, but is that the restorer’s aim? Instead they restore it to a very good condition but leave it with a patina of age. Shelves bow downwards slightly in the middle, there are slight marks under the polish, and the feet bear some of the scuffs they have picked up over the years. Over-restoration, in which it looks too new, just isn’t the thing, because then it ceases to look like the real thing that it is.

Spending a lot of time over the years around retrocomputers and retrocomputing enthusiasts, it’s interesting to make that comparison with antique furniture. Why do we not allow our antiques to wear with pride the patina acquired through the decades, and why do we prefer to pretend that it’s 1988 and they’ve just come out of the box? Is it because we’re really recreating our own childhoods (or perhaps those we wish we’d had) rather than appreciating the devices as relics in their own right?

With an increasing number of modern reproductions of classic cases and motherboards being produced, it seems to me that we’re blurring the line between the original and the reproduction just as an imitation furniture maker does to the genuine antique. Will we in time seek to differentiate our classic machines from the repro pretenders by the patina of age? Maybe it will be left to a future generation of retrocomputing collectors to make that jump.

Header: Mark Fosh, CC BY 2.0.



Bits and Bytes and Data Delights | Hacker Noon





Tune in to Listen to Tech Stories from Hacker Noon 2-3 times a week!

Limarc Ambalina, Ellen Stevens, and Amy Tom chat about data privacy ☠️ Humans are in loooove with the internet, and data production is becoming more rampant and autonomous. How has this process affected our daily lives, and is it a good thing or a bad thing? Let’s find out. 😵

Listen to The HackerNoon Podcast on Apple Podcasts, Spotify, or wherever you listen to your podcasts.


  • How do you control your digital footprint? 👣 (01:20)
  • Is downloading apps that aren’t from the Google Play Store safe? 🧐 (08:15)
  • AI might be better than people at spotting wildfires thanks to data 🤯 (14:57)
  • If there was an AI that could detect danger in public spaces, but first needed hours and hours of video of public spaces, is that okay? ⚖️ (18:49)
  • Breaking down the TikTok algorithm and its use of AI 👩‍🔬 (24:16)
  • What’s the problem with Bezos and space? Here goes Amy on another space war rant 😂 (34:45)



Limarc: [00:00:00] Hello everyone. Welcome to another Hacker Noon podcast. I am Limarc, your stand-in host for the day: the gaming editor at Hacker Noon, VP of growth, general gamer, anime nerd. We’re also joined by Amy Tom, Nicolas Cage’s number one fan and our regular podcast host. Hello, Amy. And we also have Ellen DeGeneres...

now, I’m just kidding: Ellen Stevens, our editorial assistant extraordinaire and all things newsletter at Hacker Noon. Hello, Ellen. Hello. So this week on Planet Internet, we’re talking about data privacy, and also the pros and cons of the way that the data market is working in the world right now.

And we have a bunch of Hacker Noon articles that talk about that. So let’s start with the first one, which is titled “The Most Expensive Things in Life Are Free of Charge: Protect Your Data,” by Zen Chan. What did you think of this one?

Ellen: [00:01:05] I thought this was an excellent piece because they really go into discussing how you can manage your online footprint.

So all of the things that you’ve been sharing with different companies, different log-ins, et cetera. It really dives into what you can do to maybe remove some of that data from just arbitrarily existing in the world. And I think that’s very valuable in this day and age. I would have liked to see a little bit more focus on exactly what can happen

when all that data is out there, because sometimes when you’re signing up for different emails, different programs, different things like that, you are sharing quite a plethora of personal information. And so the more that you share, the higher the chance of one of those companies potentially getting hacked or having security vulnerabilities, and it really allows a hacker, whomever, to build quite a profile on a particular user.

That’s one thing that maybe I would have liked to see a little bit more of, but I think they did a fantastic job covering some of the suggestions for what you can do to minimize the data that’s already out there. So one of the things they suggest is to review your online accounts and clean them up.

So sometimes, for example, maybe you signed up for an email forever ago and you no longer use it. So why not close that account? Or some app that you signed up for? Why not actually go back and clean that up too, to make sure that data is no longer potentially at risk? Because if it’s an application you’re no longer using, maybe other people are also not using it.

And so the upkeep and the security that’s currently present in the application might be compromised. So that’s one of the things. The other thing that...

Limarc: [00:03:09] W-what would you... could you give a brief definition of a digital footprint? What would you say a digital footprint is?

Ellen: [00:03:18] I would say that it’s various amounts of data that are out online about a particular person. So I think things like name, date of birth, fitness data, phone numbers, credit card numbers, your first pet’s name, I think, is listed here. Things like that. Yeah.

Limarc: [00:03:41] I just want to flex my Japanese knowledge.

This phrase, directly translated, actually means “there’s nothing more expensive than free stuff.” Just wanted to tell you all that I can read that. No big deal.

Ellen: [00:03:56] Oh, we’re very impressed.

Limarc: [00:03:59] Amy. So talking about all of these things you could do to minimize their digital footprint and reduce the amount of data you have online point blank.

Do you care enough to do any of this stuff? Why or why not? 

Amy: [00:04:11] It depends. If you think about the kind of data that is available online, like my name, obviously I don’t care. Something like my first pet’s name, maybe I care about a little bit more, because it’s like an answer to a security question.

Actually, a few months ago I went on a date and the guy asked me what my pet’s first name was. And I was like, I don’t know... relax.

Ellen: [00:04:37] Random. 

Amy: [00:04:38] Yes. And it depends on what the kind of data is. Yeah, stuff like my fitness data, my weight, I don’t really care. But my credit card information, my social security number, my pet’s first name, those are data points that I hold with more value. So, it depends. Yeah.

Limarc: [00:04:56] What about you, Ellen? Do you care enough to do any of these methods they listed?

Ellen: [00:05:02] I care profoundly, but I don’t always do them. One of the things, one of the stories that I wanted to share with respect to this article: a couple of years ago, I received a very fascinating email in my core email account, the one that I use.

And the subject headline was one of my former passwords for that account. Yeah. And so, you can do this thing when you know it’s a shady email, where you view the source of the email and you don’t actually open the email, because sometimes if you open the email there’s... anyway. So I viewed the source of the email to see, and it essentially was someone trying to get more information from me, threatening me.

Stuff like that. So what had happened is that particular email service provider had a massive security breach. And when things like that happen, your information gets posted on the dark web. And yeah, so they posted my email and the password associated with it. Thankfully, by this point I had changed it a number of times, but I did keep it the same for quite a while.

So there’s a website that you can check, but I can’t remember what it is called.

Limarc: [00:06:16] Have I Been... something like that, I think.

Ellen: [00:06:19] So, yeah. Yeah.

Amy: [00:06:22] That seems very, like, early-2000s internet age. Yeah.
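The lookup the hosts are half-remembering never needs your actual password. Breach-check services of this kind use k-anonymity: you send only the first five hex characters of the password’s SHA-1 hash and match the rest locally. A minimal Python sketch under that assumption (the endpoint in the comment and the sample response are illustrative, not something the hosts confirm):

```python
import hashlib

def k_anonymity_parts(password: str) -> tuple[str, str]:
    """Split a password's SHA-1 hash into the 5-char prefix that is
    sent to the breach-lookup service and the suffix kept locally."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def found_in_response(suffix: str, response_text: str) -> int:
    """Scan the service's 'SUFFIX:COUNT' lines for our local suffix.
    Returns the breach count, or 0 if the password wasn't found."""
    for line in response_text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate.strip() == suffix:
            return int(count)
    return 0

prefix, suffix = k_anonymity_parts("password")
# The real request would be something like:
#   GET https://api.pwnedpasswords.com/range/<prefix>
# Here we parse a made-up two-line response instead of going online.
sample = "0018A45C4D1DEF81644B54AB7F969B88D65:3\n" + suffix + ":9545824"
print(prefix, found_in_response(suffix, sample))
```

The point of the split is that the service only ever sees five hex characters, which thousands of other passwords share, so it cannot tell which one you checked.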

Ellen: [00:06:28] The almost miraculous thing, though, with that particular email that I had is there were multiple security breaches.

Among the companies that I had used it to sign up, there were multiple companies that had massive security breaches. So yeah, I care. I still have a Facebook and Instagram, though. Yeah,

Limarc: [00:06:54] The exact same thing happened to me too, but, like, the password they said they knew was my password for my 2001 laptop that has no valuable data at all.

So I was like, okay, go ahead, take that laptop. On the subject of lessening the data you put out into the world, our next article is similar. It’s called “Have Your Privacy Cake on Android and Eat It Too.” Amy, what is this one about, and what did you think of it?

Amy: [00:07:23] Yeah, it’s about the Android privacy protocol.

And honestly, I thought it was a bit confusing, because it’s suggesting that instead of logging into your Google account, to increase the level of privacy control that you have over your Android, you can log in via something else. I think it’s called the Aurora Store.

And then it basically works like the Google Play Store, except not. And then you don’t have to connect your Google account to your Android phone. But the reason that it confused me a bit is because, from everything that I have heard or know about Android security, the number one thing is: don’t download apps that are not from the Google Play Store.

So is the trade-off worth it? I don’t know. What do you think, Limarc? Yeah.

Limarc: [00:08:17] So it’s interesting. The process this guy describes is using a different one, not the Android default, and, like you said, not using the Play Store but the Aurora Store. But Google gives you that warning because, of course, they spend a lot of money to improve the cybersecurity of their specific store.

So that is true in a way, but also they’re cutting out the competition by saying: don’t download any apps other than from our store, right? Yeah. If I wanted to make one phone really private, I could see myself definitely doing this, but it depends. Cause if I want to use Facebook or YouTube or any product that requires a Google sign-in on that phone, then I can’t anymore.

So maybe if I had a company or whatever, and I wanted to give all my employees work phones, maybe I might create a system like this, where they don’t give Google any of their data and we use limited apps. But for my personal day-to-day usage, I don’t think I care enough to do such a thing.

What about you, Ellen? Would you wipe all of the Google apps from your phone like this?

Ellen: [00:09:30] So, in an ideal world, I’m, you know, a firm believer in using encrypted email services and applications. And there are companies out there who actually specialize in this. So instead of relying on Google for the majority of a company’s sort of functionality, there are options out there that are encrypted.

If I was setting something up, that would be something at the top of my list to do. Right now, it is what it is. I do, however, use an iPhone, so this doesn’t entice me.

Amy: [00:10:13] What I appreciated the most about this article, though, is that as it describes what you need to do, it also says that you need to create a work profile, in case you need, like, certain Google account things. So you create, like, a faux profile, but every time they mention the word work, it’s in quotes.

So it’s your “work” apps.

Limarc: [00:10:36] Yeah, for sure. And, wink: I think Apple, I’ve heard that, yeah, iOS has a bit better security than Android. So maybe Ellen’s winning if she’s Team iPhone, but I’m never going to go back to the iPhone. I can’t imagine what would make me go back to the iPhone.

They’d have to have a really specific, exclusive app that I wanted. 

Ellen: [00:10:57] Yeah, I generally try and have both systems. Even with computers, I had both. However, right now I am considering maybe just getting another phone to keep work and personal separate. The thing, though, is I don’t have much going on personally.

Limarc: [00:11:21] You could get into this field, so polling for the personal stuff, but

Amy: [00:11:28] Says the girl that sent me a Slack message at three in the morning yesterday.

Ellen: [00:11:33] Okay, I’ve been wondering when that would come up, because I’ll be up at four in the morning, like, something really strange. But at what point are people going to be like, Ellen,

Amy: [00:11:42] do you sleep? 

Limarc: [00:11:44] I also forgot to introduce the author, and I want to now because of his profile picture.

So his name is Pizza Panther, and he has the Pink Panther. He’s a father, web developer, and pizza maker. And that’s all that matters: pizza.

Amy: [00:11:59] Yeah, he’s a Pizza Panther. I also noticed that and I appreciated it a lot. But I also appreciate the fact that below it we have the book-a-call button, and it says: book a call with Pizza Panther.

Limarc: [00:12:10] I like the fact that he’s putting that in his writer bio. Like, he must make pizza from scratch, like he kneads the dough. This is my image. Yeah.

Amy: [00:12:18] It’s gotta be good pizza.

Limarc: [00:12:21] Yeah. He might have, like, his own stone-fire pizza oven.

Ellen: [00:12:26] Yeah. Yeah. Currently that’s a thing. 

Limarc: [00:12:31] Speaking of fires, the next article is about how AI can spot wildfires faster than humans.

Beautiful segue.

This article is actually by someone I’ve known for a while. His name is Louis Bouchard. He’s a TA at a computer vision program at the University of Montreal. And he runs this AI YouTube channel that has grown really fast. So I want to talk to him about how he did that so we can match it, but he basically posts these videos about different AI research papers or different AI tools that big companies have built.

And he explains it in a way that anybody can understand, which I think is really missing in not just the YouTube space but the AI space in general. So he does something really cool, and we publish those videos on Hacker Noon. And this is one of them. Ellen, did you take a look at this video at all, and what did you think about it?

Ellen: [00:13:30] I did. I thought it was fantastic. So what’s happening here is they’ve developed an AI program which can recognize if any forest fires are starting. And the reason why the AI is better than people is because it can monitor various sections of the forest 24/7. And so it can recognize fires, and a potential problem, much faster than

a human might, because humans get distracted. They might not necessarily be paying attention 24/7. And so the rate at which the AI catches things, compared to humans, is much, much faster. And this is fascinating, because they talk about how they actually made this happen, how they trained the artificial intelligence to be able to spot this.

And one of the things, of course, when you’re training artificial intelligence, is the data is very important. The type of data that you’re training it on has to be very good quality data, because the better the data, the better the functioning of the AI. But what I thought

was particularly incredible was they had 20 people manually labeling 9,000 images as precisely as possible. So they’d have photos of the fire starting, and they’d have to outline the part that was the smoke or the fire, et cetera. And yeah, he goes into a lot more detail, but I just thought it was fantastic.
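The labeling Ellen describes produces per-pixel masks: for each image, which pixels are smoke or fire. A toy illustration of that format, with the intersection-over-union check that labeling teams commonly use to compare two annotators’ outlines (the tiny grid and both masks are invented for illustration, not from the article):

```python
# 0 = background, 1 = smoke/fire, on a tiny 4x4 "image"
annotator_a = [
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
]
annotator_b = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
]

def iou(mask_a, mask_b):
    """Intersection-over-union of two binary masks: 1.0 means the
    annotators outlined exactly the same pixels."""
    inter = union = 0
    for row_a, row_b in zip(mask_a, mask_b):
        for a, b in zip(row_a, row_b):
            inter += a and b   # pixel marked by both annotators
            union += a or b    # pixel marked by at least one
    return inter / union if union else 1.0

print(round(iou(annotator_a, annotator_b), 3))  # 0.833: they agree on 5 of 6 marked pixels
```

At real scale this is the same arithmetic repeated over millions of pixels per image, which is why the “as precisely as possible” part takes 20 people and 9,000 images.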

Limarc: [00:15:17] Cool. Yeah, and I worked in the AI training-data industry for a while, and that process you just explained is actually a smaller project. If you look into the projects that bigger companies do, 20 people and 9,000 images is nothing. There are some projects that have hundreds or thousands of people doing the same thing.

So that’s pretty cool. But the reason I included this article in this week on Planet Internet is because it shows what we can build, and the benefits of having a democratic data policy, where we aren’t so private about data, and what people can build with it. So my question to you, Amy, is: when you see the positive impact that AI can have on the world, based on the enormous amounts of data we feed it,

do you think the positive impacts are a good enough trade-off to just accept the fact that we’re not going to be as private with our data in the future?

Amy: [00:16:15] That’s really interesting. I was talking to someone on the podcast a little while ago about data-for-good initiatives. And I think that what you’re talking about, the data behind this, like fire data or data on wildlife or whatever it is, that kind of data,

we don’t really have privacy concerns about. It’s a different question if you’re asking me about my personal data and what it’s being used for in AI, which is to sell me things. So in that sense it’s less kosher, I think, than using data for good in terms of helping with wildfires or whatever the cause might be.

So yeah, I think that’s where I’m falling. Yeah. 

Limarc: [00:17:02] I’m glad you said that, because I prepared for such an answer. So a similar use case, with a similar training method, would be to create an AI that could detect danger in public spaces. So the AI could detect somebody pulling out a knife or pulling out a gun.

But in order to do that, you have to feed the AI hours and hours of video of public spaces. And that’s closer to the data you’re talking about, if that’s data of people walking around outside, or people going shopping in the store. So that’s pictures and images of you that are being fed to this thing, but it is for a good cause.

Thinking about that, and I can throw the question out to both of you, or Ellen: when you think about that, is it okay to give up your privacy and your data if it’s to build tools like that?

Ellen: [00:17:56] I think when something is an ethical type of question, there needs to be a committee that has a certain set of points where they discuss, like, a sort of ethics. This is the...

Limarc: [00:18:13] This is the committee! This is the committee. We are discussing this.

Amy: [00:18:17] Welcome.

Ellen: [00:18:19] It would be a... it’s a complicated, it’s a complicated thing.

Ultimately the decision would come down to whether the compromise in personal information outweighs the overall societal benefit of using non-personal information. And so in the case of recognizing potentially violent criminals on the streets, you’d have to look at the rate of that happening within a particular population.

And the ways in which the data could be abused by the government or the legal system. So it’s hard for me to just really decide without having looked at various data points about that information. The thing that I will say about fires, though: there’s quite a lot of fires right now in Ontario.

And the other day I woke up and everything smelled like smoke, and it’s the craziest thing to take fresh air for granted. Even though I actively moved to an area with fresher air, like, intentionally, but nonetheless: to open a window expecting, just, a good smell to come in, and have everything smell like smoke and fire, is the craziest experience.

So definitely support this. 

Limarc: [00:19:50] That’s scary. Is the fire that close, or does it just travel that far? Like, you’re not in a danger zone, but you can just smell the smoke from, like, a hundred miles away, or what’s the situation?

Ellen: [00:19:59] I don’t think it was too close, the fire. The smoke, I think the smoke blew in this direction.

Amy: [00:20:10] So has this happened in your area before?

Ellen: [00:20:13] I imagine it probably has, but in all the time, like, I’ve been here for almost, I think, nine or ten years, and this is the first time that I’ve opened the door to my mini balcony and it’s just been like, wow.

Amy: [00:20:30] Okay. Because in BC, where I’m from, in Vancouver, it happens almost every year.

We get a little bit of smoke and fire coming from either California or Northern BC. But right now our province is in a state of emergency because there are so many wildfires burning. And I am in a similar boat. It’s not so bad in the Lower Mainland yet, when I open my window, but it’s coming down, and there are hundreds of wildfires burning in BC right now, destroying thousands and thousands of acres of trees and land, and even houses.

So I also strongly believe in this cause. But to answer your original question, Limarc, and to give maybe a shorter, more concise answer: I think no, I don’t think that people should be watching people, because anytime data involving people is being collected, someone’s using it for not-good.

So yeah, I don’t think that should be a thing. And I think, yeah, I agree with Ellen: it comes down to the level of the rate of attacks, or whatever the case, like the rate of X. And I don’t think that anything would be greater than the privacy of people in general.

Limarc: [00:21:51] I see. I see.

Speaking of that: basically, I wanted to give a good example of people using AI for good, and the next article is about a gray-area example. I’ll let you decide whether you want to call it good or bad, but it’s basically about how TikTok uses data to keep people on the platform. So this article is called “Hacking Your Mind with AI: The Case of Addictive TikTok,” written by Edwin.

It’s basically about, as I said, how TikTok uses data to keep people on the platform. So what did you think of this article?

Amy: [00:22:31] It was interesting. I think that, from what I know about TikTok, this makes a lot of sense, because it’s based off of the amount of time that people spend watching a particular TikTok, the number of likes that it gets, the number of shares that it gets.

And obviously, the more interaction and time spent on a particular TikTok, the better it’s going to perform and the more the algorithm pushes it up. And I think that TikTok has a really interesting algorithm because, compared to the other social algorithms, it’s very easy, not very easy, but it’s much easier to have a viral video than it is on any other platform currently.

I think with the, like, case of addiction, this makes sense. And yeah, I think we’re going to continue to see that in their algorithm and the way that TikTok rose. If you notice, like, if you’re watching certain TikToks, you’ll start to see that at the very end of the video, they put, like, a little bit of, oh shit,

I think I should watch that again, so that you have to watch the whole thing again. And that just keeps you, like, watching, and time spent on the video. Yeah. Sorry,

Limarc: [00:23:33] What do you mean by “at the end”? They make a trigger that makes...?

Amy: [00:23:37] They add, like, a quick little one-second clip at the very end of the video, so that you’re like, wait, what was that?

And I’ve got to go watch it again. So then you watch the whole 15-second video again, just to watch that last second.

Limarc: [00:23:51] Interesting. Yeah. What about you, Ellen? What do you think of the way TikTok uses them?

Ellen: [00:23:59] So I am not a TikTok user at this time. However, what I have heard about TikTok is that a number of different companies and legislators are concerned about the level of security there.

This may very well tie into that. But at the same time, I don’t think, for this particular topic, I have too much to say. I’m not an expert on TikTok.

Amy: [00:24:28] I have a question, Limarc, because I think you have more knowledge of AI than I do.

Do you know if using AI in a social app algorithm is a common thing to do, or is this revolutionary?

Limarc: [00:24:45] That’s a, that’s an important point to discuss. So I think a lot of people, when they think of AI, they immediately think of something that. Like self learns based on all the input you give it. And then like becomes this God machine, but it really isn’t when you look into AI there’s a reason why it says AI algorithms and AI models is because it’s just really a set of algorithms that trigger responses and it fakes, it simulates the way our brains work, but it is just math at the end of the day.

Like when we talk about AI algorithm of tick talk, we’re just talking about a more complicated version of something as simple as a spam filter, but they’re just using it for a different purpose. So no, it’s not revolutionary at all, but it also depends on what their weights are on the algorithms, like what they might be tracking and what’s, they might be The predictions they might be fighting from that data could be revolutionary.

It could be better than YouTube or Facebook’s. The only way the only people that would know is their data science team and how they were. Figure that out, but the idea of using AI to promote content on a social platform, isn’t revolutionary at all, but that’s the way that you said they’re doing it better than other people.

They must be doing something better and something different than the YouTubes and the Instagrams of the world. 
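Limarc's spam-filter analogy can be made concrete with a toy sketch. Everything below (the words, weights, and threshold) is invented purely for illustration; it is not how TikTok or any real filter actually works, but it shows the sense in which a "model" is just weighted math over inputs:

```python
# Toy "spam filter": an AI model reduced to weighted math over its inputs.
# All words, weights, and the threshold are made up for illustration.

SPAM_WEIGHTS = {"free": 2.0, "winner": 3.0, "click": 1.5, "meeting": -2.0}
THRESHOLD = 3.0

def spam_score(message: str) -> float:
    """Sum the weight of each known word appearing in the message."""
    return sum(SPAM_WEIGHTS.get(word, 0.0) for word in message.lower().split())

def is_spam(message: str) -> bool:
    return spam_score(message) > THRESHOLD
```

A message like "free winner click here" scores 6.5 and gets flagged, while "project meeting at noon" scores -2.0 and passes. A recommendation model has the same shape, just with far more inputs and learned rather than hand-picked weights.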

Amy: [00:26:06] Yeah. It's just a completely different algorithm and a different purpose, I guess. With Instagram, it's harder as a brand to show up on someone's feed; it's easier to show up as an individual. With TikTok, it's really easy for brands to go viral because it doesn't differentiate. 

Limarc: [00:26:26] Yeah, and that's important too. Because if your algorithm puts a stronger weight on the number of followers a brand has, then in that sense your algorithm is helping the rich stay rich, and it's harder for the poor to get rich, because it's harder for somebody who has zero followers to get a thousand than it is for somebody who has a million to get another hundred thousand.

So it depends on the kind of platform you want to build. 

Amy: [00:26:52] Yeah. And regular people, not just businesses, can also go viral. 
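Limarc's "rich stay rich" point about follower weighting can be sketched as a toy ranking score. The formula and all the numbers here are hypothetical, chosen only to show what the weight does:

```python
import math

def rank_score(engagement_rate: float, followers: int, follower_weight: float) -> float:
    """Toy feed-ranking score: content engagement plus a weighted
    log of the account's follower count (invented for illustration)."""
    return engagement_rate + follower_weight * math.log10(followers + 1)

# A small creator with great content vs. a big brand with mediocre content.
small_creator = rank_score(engagement_rate=0.9, followers=100, follower_weight=0.5)
big_brand = rank_score(engagement_rate=0.3, followers=1_000_000, follower_weight=0.5)

# Zero out the follower weight (a platform that "doesn't differentiate"),
# and engagement alone decides the ranking.
small_flat = rank_score(0.9, 100, follower_weight=0.0)
big_flat = rank_score(0.3, 1_000_000, follower_weight=0.0)
```

With the 0.5 weight, the big brand outranks the small creator (roughly 3.3 vs. 1.9) despite worse content; with the weight at zero, the better content wins.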

Limarc: [00:26:59] Yeah. But my question with this article is still the same. When you see what TikTok is doing, or not even just TikTok: YouTube, Instagram, whatever you have, the way they track your data and use it to influence what you see on those platforms.

Do you think that's okay because it gives you a better experience on those platforms? Or do you feel like we've just gotten too complacent, and we just feel like it's normal, so everyone's going with the flow?

Amy: [00:27:33] No, I think I like it. I like having the kind of content that I want, because my Instagram knows that I like dogs and bunnies, and that's great, but it doesn't know my address. So I think I'm good with that. 

Limarc: [00:27:48] I see. What about you, Ellen? How do you feel about any social media platform pushing content based on your activity? 

Ellen: [00:27:58] I think that sometimes what that leads to is an echo chamber. If you have one particular viewpoint, and the content you're seeing on the platforms you interact with daily only reflects it, you're really only getting one side of a particular story, and that makes it very easy to manipulate a population.

I think that's why it's important to have access to multiple perspectives and different types of content. And this idea that the big tech overlords are essentially controlling what large sections of the population see and think is very dangerous. It could be used, and it is used, for all sorts of questionable political motives.

And so I think, like anything, it's very questionable. 

Limarc: [00:28:58] Do you agree with that point, Amy, or are you on a different page? 

Amy: [00:29:03] No, I think I'm probably in about the same boat. Yeah. 

Limarc: [00:29:07] I want to give a personal example of why I believe that's so true. It's because I moved to Japan. After coming to Japan, it became so apparent that whatever news is shown to you on the internet or on TV becomes your reality. And the reason I realized that is because the first time I came back to Canada after, I don't know, say a year or two, there might be a song on the radio and I'd be like, who is this? And everybody around me would be like, how do you not know X person?

So that's a simple way of explaining it, but the more interesting way is just general news. There might be somebody, a friend from Canada, who says, oh yeah, man, did you hear about this social thing that happened? And everybody in Canada knows about it, but because I'm in Japan, none of us care about it.

So that reality doesn't exist for me. And it's super interesting that what's pushed on your Facebook feed and your Instagram feed is your reality, and it's strongly dependent on your location. Your reality can drastically change if all of the news you're being pushed is right-wing, or all of the news you're being pushed is left-wing; you just see one side of a topic, and then it becomes reality for you.

Because that's all you see. I think some people might say, oh, it's just social media, it doesn't matter. But like it or not, social media is the common newspaper; that's what we use now. People might say, no, I don't listen to it. But you do: you see these things on social media, and it basically becomes how people view the world.

Amy: [00:30:44] It also depends on who your friends are too, at least with Instagram, I believe, because it curates your Explore feed, or whatever they call it now. So based on the things your friends like, that's the kind of content that gets shown to you sometimes too. So it also depends on who you're following. Yeah.

And another point on that, not just with social issues: for me personally, because I'm so involved in the tech world and all of my friends are in tech, sometimes I feel pressure, because I don't know enough or I'm not good enough in tech. But then I meet other people who are not in tech at all,

and they're like, how do you know so much? And I'm like, oh yeah, it's because my whole social network and circle is in tech, so everybody knows about tech. But then there are people who don't know anything about tech, and I'm like, I know way more than them. So it also depends on that too, I think. 

Limarc: [00:31:40] For sure.

Yeah. Speaking of tech, Amy's been talking a lot about a tech billionaire who recently went to space, and I think you have a few things to say about our newest astronaut. What's the problem, Amy? What's going on?

Amy: [00:32:00] Oh, what's the problem? Did you guys see the video of them in space?

Limarc: [00:32:06] I saw the launch, but I didn't see the in-shuttle video. Is it funny? Is it good? 

Amy: [00:32:11] No, it's not funny. It's hurtful to my soul. They're floating around there, and Jeff Bezos goes to the 18-year-old and he's like, hey, see if you can catch this in your mouth, and he throws him a piece of candy or whatever, and he catches it, and they're jumping around.

And it was like, yeah, oh my God, this is so much fun, we're billionaires. And I'm like, oh God, the worst. But what I am excited about is that among the people who went to space was an 80-year-old woman, which is very cool, because she's the oldest woman to have become an astronaut. So that's exciting for women in tech.

Limarc: [00:32:53] She can do that, but I can’t ride a roller coaster. That’s great. 

Amy: [00:32:56] Yep. And then the 18-year-old. Yeah, they were just throwing ping pong balls around. It was super obnoxious. Go to Blue Origin on Instagram and look up the video. Yeah, because it's offensive.

Limarc: [00:33:16] Hold on. This is weird.

Amy: [00:33:21] No, actually go to their page, go to their page. Which page? 

Limarc: [00:33:24] Yeah. 

Ellen: [00:33:26] Yeah. 

Amy: [00:33:28] And then go to their IGTV. 

Limarc: [00:33:31] Okay. 

Amy: [00:33:32] Okay. Okay. And then... we have to watch this. 

Ellen: [00:33:39] Someone should describe what's on screen.

Amy: [00:33:41] They're floating around in space and they're literally just, oh my God, this is so much fun, look at us, woo. And then, oh, now Jeff Bezos is holding hands with Wally, the 80-year-old woman who went to space.

They're hugging, they're floating upside down. I need to see the part where they have the ping pong balls. It's the ping pong balls that really offended me. Like, of course you brought ping pong balls up to space to have a little throw around. And it's the part where they throw candy or whatever into each other's mouths that really offends me.

Like, they're just having a party up there while everybody else down here is poor. And there are people who are starving, there are people who need vaccines, and they're just throwing candy into each other's mouths in space and saying, oh my God, this is fun. I hate it.

Ellen: [00:34:38] Someone absolutely needs to take everything that Amy was saying and do the audio overlay over that video.

Don't look at me.

Limarc: [00:34:53] "In space, and people need vaccines, and they're just floating." Yeah. What about the social distancing on that Blue Origin shuttle?

Amy: [00:35:03] There was none. And also, everybody is calling it the space penis, which is delightful.

Limarc: [00:35:16] Yeah. I don't think I have as much inherent hate. It's not a "screw the patriarchy" thing for me; that's not my hate. It's more: what an unbelievable waste of money and time. If you're going to do it, why only the 11 minutes? Why don't you stay there for a couple of hours? You're already up there.

You've already taken the risk to get up there; I'm pretty sure launching is the most dangerous part. So why stay for only 11 minutes? Go longer. And it costs millions of dollars to do that. Couldn't you launch a satellite at the same time or something? Couldn't you have made it useful?

Amy: [00:35:53] Yeah. Yup. I don’t know. 

Limarc: [00:35:57] Yeah, but we'll never know until we reach that level of billionaire ourselves. Just wait, you'll get there, and then we'll find out why he did it, perhaps.

Ellen: [00:36:09] Yeah. When Amy's a millionaire, she'll be like, maybe I don't hate them so much now.

Amy: [00:36:15] Actually, that 18-year-old son of a millionaire is cute. So, if you want...

Ellen: [00:36:23] Not all billionaires.

Limarc: [00:36:32] All right. Thanks for joining us this week on Planet Internet. Once again, I'm Limarc, your host, joined by Amy Tom and Ellen Stevens. Thanks for joining, and see you next time. Bye.
