Monthly Archives: March 2021

News: ADL CEO Jonathan Greenblatt dives into tech’s reckoning with online hate


On January 6, America watched in horror as groups that recruited and organized on major social media sites violently attacked the seat of American democracy. Within a matter of hours, tech companies took actions they’d said were out of the question for years.

But real change requires thoughtful policy and a clear-eyed look at the choices that allowed dangerous extremism to thrive in the first place. We spoke with Anti-Defamation League CEO Jonathan Greenblatt on proposed policy solutions and tech’s coming era of accountability at TechCrunch Sessions: Justice 2021.


On how the ADL ramped up its efforts in Silicon Valley:

Given the rise of online hate, harassment and dangerous misinformation, tech companies are increasingly on the radar of civil rights organizations. It’s now common to see organizations like the ADL participate in pressure campaigns aimed at changing platforms’ policies and sign onto legislation proposing regulations for the industry.

“So at the ADL, we’re the oldest anti-hate organization in the world. But we deeply believe that today, the frontline in fighting hate is really on Facebook. I mean, there’s just no question that social media has become a breeding ground for a kind of bigotry that is offensive and ugly in all respects. Now, we’ve known this for years. But when I came on board, about five and a half years ago, I really wanted to focus on this and try to get causal and right to the heart of the problem, so we could finally turn it around. So in 2017, we actually opened an office in Silicon Valley, our Center for Technology and Society. We were the first civil rights group with an actual presence in the valley. And for me, that was sort of second nature, because I had worked in the valley for years before taking this job, you know, both raising money on Sand Hill Road and managing teams of engineers building products.” (Timestamp: 0:53)


How algorithms make social media uniquely dangerous:

Algorithms are what set social networks apart from more traditional media sources. Rather than seeking extreme ideas out, the average internet user has them served directly by algorithms that decide what they see. This is a particular issue with the way Facebook and YouTube keep users engaged for as long as possible.

Algorithmic amplification has a lot to do with the dilemma that we found ourselves in, and extremists are, if nothing else, innovative; they exploit loopholes. And indeed, they have used the kind of libertarian, laissez-faire attitude of the companies to their own advantage for a number of years. And so from Facebook groups to YouTube channels to accounts on Twitter, let alone all the other platforms, they’ve used them with tremendous depth. So what’s interesting is, and many people that I know have seen this, I’ve seen this myself, you may have too, I’m sure your audience has. It wasn’t too long ago that you might watch a YouTube video and one click or two clicks over, suddenly find yourself down the rabbit hole of some crazy QAnon or anti-vaxxer, you know, Boogaloo content. Same thing on Facebook.

When you search a piece of content, suddenly you’re served up Facebook groups that may be from accelerationists, or white supremacists, or other racists and anti-Semites. But the reality that we’ve got to confront is that algorithms aren’t a right, if you will; algorithmic amplification isn’t a privilege which should be accorded to everyone. It’s a responsibility that the companies have to make sure that their products give users what they want, but that they’re also not abused. And that the users themselves are not abused, not exposed to the kind of things to which they might be very vulnerable or susceptible. So we deeply believe that algorithmic amplification is very problematic. That’s why we’ve been supporting legislation on Capitol Hill that will finally address this… If you could basically turn off the algorithms for some of these worst elements, you could have curbed these issues a long time ago. (Timestamp: 13:35)


How social media companies failed before the Capitol attack:

In the immediate aftermath of the attack on the Capitol, social media companies suddenly made a number of changes that laid bare how reluctant they’d been to address the hate and extremism brewing on their platforms all along.

To those of us who’ve been tracking violent extremists for years, this was not a surprise at all; this was the most predictable terror attack in American history. Literally, these groups told us in advance what they were going to do. And the attack itself was sort of the culmination of years of campaigning, intensified in the months prior, by the President himself to undermine the integrity of the election, to question the democratic process, to call on individuals to interrupt the certification of the election based on this big lie, this totally contrived idea that somehow the election was rigged. I mean, truly, it was bananas.

… The tech companies, who for years have told us there was a political exemption and they wouldn’t necessarily take action when presidents or other politicians said things that were outrageous, committed slander or incited violence on the platform, suddenly, because of the public pressure from groups like Stop Hate for Profit and the ADL, from internal pressure from their own employees, and I believe, you know, their boards — suddenly they took action instantaneously, overnight. All their other concerns sort of fell by the wayside. I think it was really important that Facebook and Twitter and YouTube took down President Trump; that was critical. We called for them to do that, and I’m really pleased that they did. We called for them previously to take down armed militia groups, to take down QAnon content. And I’m really glad that they did and it had a huge impact.

You know, we’ve seen QAnon content on Twitter drop 97%, you know, just days after the attack, because the company actually took action. So I think it really laid bare the myth that somehow, some way, the companies couldn’t do anything about this; clearly they could. And they did. And I think their services and society as a whole are better for it. (Timestamp: 7:53)


On Silicon Valley exceptionalism

The tech industry doesn’t think of itself like other traditional sectors of business, instead often casting its grand pursuits as being for the greater good — not just for profits. Those attitudes can contribute to some extraordinary innovations, but they also permeate its products and cultures in ways that create serious problems.

I think Silicon Valley is almost like, rooted in this American tradition of, like, Manifest Destiny, right? Conquering the frontier. It’s ironic, but altogether appropriate, that it’s happening in California, right in the land where they had the Gold Rush, right, again, where people went to make their fortunes. And now they’re doing it today in Silicon Valley, in tech, and even that’s continued to evolve, right? It was the internet 15 years ago, social media five years ago. Today, it’s Clubhouse, and I don’t know what comes next. But I do think that the whole industry does need to undergo a serious self-examination.

And I think you’ve seen people who’ve come out of the industry, I think about Chris Sacca, the former Googler, I think about Alexis Ohanian, the Redditor, and a few others, start to grapple with these issues. You know, my friend Tristan Harris at the Center for Humane Technology has also done this in his film — The Social Dilemma really plays this out. Whereas Silicon Valley often has a very short memory, the reality is that there will be a long road ahead of us. And if we don’t wrestle with these demons, and if we don’t sort, again, through the wreckage of what they’ve wrought, I think the future is very unclear. (Timestamp: 20:15)


On policy solutions to rein in big tech

There’s a huge swath of policy proposals on the table that could put some real restrictions on how tech companies operate. From proposed changes to Section 230 of the Communications Decency Act to federal and state antitrust suits, tech companies are on notice in 2021.

So look, the ADL, I mean, we’ve been literally fighting for a more just country, we’ve been fighting for civil rights, we’ve been fighting hate for over 100 years. And we are fierce, ferocious defenders of the First Amendment. But freedom of speech isn’t the freedom to slander people, right? Freedom of expression isn’t the freedom to incite violence against individuals or groups of people based on their immutable characteristics. And so I think what we’ve seen is the First Amendment being warped and weaponized online in ways that are, you know, completely beyond the pale of what the founding fathers ever would have, you know, could have imagined.

Section 230 does need to be addressed. And I think that Warner-Hirono bill that you pointed out is a step in the right direction; it is definitely not sufficient… It might actually not be the federal government, but the states, that actually push the companies to do more. We’ve seen California do some innovative stuff on privacy that’s pushed the companies, and you may see, I think, more state action. (Timestamp: 16:28)

You can read the entire transcript here and review the full lineup from Justice 2021 [here].


Early Stage is the premier ‘how-to’ event for startup entrepreneurs and investors. You’ll hear first-hand how some of the most successful founders and VCs build their businesses, raise money and manage their portfolios. We’ll cover every aspect of company-building: fundraising, recruiting, sales, product market fit, PR, marketing and brand building. Each session also has audience participation built in – there’s ample time included for audience questions and discussion. Use code “TCARTICLE” at checkout to get 20 percent off tickets right here.

 

News: NFTs are changing cultural value creation


Hello and welcome back to Equity, TechCrunch’s venture capital-focused podcast, where we unpack the numbers behind the headlines.

This is our Wednesday show, where we niche down and focus on a single topic, or theme. This is our sweet spot: going beyond definitions and into the dirty and deep impact of how a phenomenon could impact startups and tech. We are hoping to explore more than answer, and debate more than agree.

NFTs, or non-fungible tokens, are this week’s topic! This is something you have almost certainly heard of in the past few weeks but probably don’t understand with perfect clarity. While we’ve all seen the Twitter threads of basic definitions, consider this episode the appetizer to your aperitif understanding.

The Equity team put on our research caps, dug in, and found quite a lot to like. But we did not tread alone: our EIC Matthew Panzarino joined Chris and Alex and Danny and Natasha to help us out. Panzer was early to the NFT world and has contributed some of TechCrunch’s reporting on the matter.

So, what did we get into? More than a little:

We spent a few minutes on the NFT basics, including historical examples and how NFTs are minted, as well as some examples of how they have been used recently.

From there we riffed on use-cases more broadly, and where we might find NFTs in the wild. Sure, we talked about visual art, but also music, tickets and sports moments. The NFT world has the possibility of a large remit if it plays its, ahem, tokens correctly.

Then we talked culture. What could it mean that NFTs are in the market? Could residual incomes from the reselling of NFTs constitute a material revenue base for future artists, and how broad can the value-experiment go? Depending on which side of the NFT hype-cynicism divide you land, there’s plenty of room for discussion. A point made by Panzer:

NFTs and the architecture of smart contracts and the way that social tokens work, these are all opportunities for the creators and originators of culture to finally take part and participate in the rewards of the platforms — you know, that host that culture. Because we’ve seen it over and over again: Artist blows up on TikTok, and you know, somebody does a dance to them, and then that video blows up. What does the artist get out of it? Sometimes they get a recording deal. Many times they get nothing. Right? Vine, famously built on Black creators and brown creators and Latino creators. You know, TikTok, very much the same. Black Twitter, one of the early driving forces of engagement on Twitter and culture on Twitter — how many of them were actually able to participate in the economic rewards of Twitter as a platform selling advertising and making millions of dollars and their stock going bonkers? Besides, of course, you know, maybe they were able to purchase stock, right? So the remapping of how creators can participate in that economy directly by saying, “Hey, I’ve created something of value, and I’d love to connect directly with the people that enjoy that and they can provide me value back” — that’s what’s so exciting about this.

And we chatted just a minute about the weight, or carbon footprint, of different blockchains. There’s real nuance to this point of argument, but it was also something we couldn’t avoid. Panzer again:

And this is probably the biggest negative blowback on Ethereum and NFTs: Ethereum is by nature a very heavy chain, which means that it takes a lot of work to prove that a block has been written to the chain. Not quite as heavy as Bitcoin, but it’s up there. And that energy usage that was used to mine that Ethereum, that’s being spent on the chain to confirm a new transaction, is being sort of credited forward, for lack of a better term, to the artists minting on it. I don’t think that’s absolutely fair. But it’s absolutely fair to acknowledge that it does have an ecological impact.

Every week Equity will bring you something special on Wednesdays, adding to our regular Monday (weekly kickoff!) and Friday (news roundup!) shows. The world of tech is large, diverse, and variously dangerous and delightful. We’re excited to keep talking through it with you.

Equity drops every Monday, Wednesday and Friday morning at 7:00 a.m. PST, so subscribe to us on Apple Podcasts, Overcast, Spotify and all the casts.

News: SIP Global Partners announces first close of its $150M fund to bring U.S. startups into Japan


Japan is often under-discussed as an expansion target for American startups, but in the past few years it has become a top market for companies like Slack, Salesforce, Twitter and, more recently, Clubhouse. Today SIP Global Partners is announcing a new fund to invest in early-stage U.S. startups that have the potential to expand into Japan, and potentially other Asian markets, too. The fund has raised a $75 million first close toward its $150 million target and has already invested in five companies.

SIP’s new fund will look at late seed to Series B-stage companies that have a product, or one about to come to market, that is ready to expand internationally. The team will work closely with portfolio companies, helping them launch operations in Japan and other Asian markets.

Managing partner Justin Turkat told TechCrunch that Japan is a promising market for foreign startups partly because an undercapitalized venture capital ecosystem means there is a smaller pool of entrepreneurs, with many of the country’s top tech talent opting to join conglomerates or the government instead.

While Japan’s startup market has a lot of potential, he added, it is still nascent. On the other hand, Japan is now the largest source of outward foreign direct investment in the world, and with about 125 million consumers and large corporations in need of scalable solutions, it’s a ripe market for new tech.

“If you look at what’s happened in the last couple of years, I think Japan is open for business with U.S. startups with an urgency that I’ve never seen before, and we think there is a lot of tailwinds around it. You look at investments and partnerships with U.S. startups, it’s at record levels over the last five years and deal counts are increasing every year,” Turkat said.

The fund is being launched by four investors, based in the U.S. and Japan. Turkat and founder and managing partner Shigeki Saitoh, former director of the Japan Venture Capital Association, are in Tokyo, while general partner Jeffrey Smith and founder and managing partner Matthew Salloway are in Boston and New York, respectively.

“The reason we started this really has to do with the team. We’ve all dedicated our careers to cross border, as both operators and investors, across the U.S. and Asia,” Turkat said. “All the four partners on average have about 20-plus years of experience doing this.”

Over the years, they’ve observed global expansion happening earlier in a startup’s life, he added. “I think it used to be an axiom that if you’re a U.S. startup and you’re venture-backed, you’re not thinking of expanding overseas until your Series D round,” but companies are now eyeing foreign markets as early as their seed rounds.

SIP’s new fund is looking for startups in three areas: creativity (augmented and extended reality, synthetic media and web-based platforms), productivity (artificial intelligence and machine learning, edge computing, the Internet of Things and semiconductors) and safety (digital health and information security).

Turkat said it is focusing on companies that provide core infrastructure or the economic layer for emerging technology.

For example, “on the infrastructure layer, we’re looking at 5G being rolled out globally simultaneously, then the edge computing, semiconductors, security and AI and machine learning, all around this infrastructure layer,” he said. Companies in the fund’s current portfolio that fit into this category include OpenRAN startup Parallel Wireless and Croquet, an ultra-low latency collaboration platform.

“Then you have the economic layer with all of these advancements, the platforms and applications sitting on top of it,” Turkat added. These include the fund’s three other investments so far: Fable, a browser-based motion design platform, Tilt Five, an AR gaming platform, and Kinetic, an industrial IoT startup focused on workplace safety.

As a strategic investor, SIP works closely with startups as they expand into new countries. This includes hiring talent and finding initial business partners, including for distribution channels or potential joint ventures. After Japan, SIP also helps startups enter other Asian markets, especially in ASEAN, including Thailand, Vietnam and Indonesia.

News: DataGrail snares $30M Series B to help deal with privacy regulations


DataGrail, a startup that helps customers understand where their data lives in order to help comply with a growing body of privacy regulations, announced a $30 million Series B today.

Felicis Ventures led the round with help from Basis Set Ventures, Operator Collective and previous investors. One of the interesting aspects of this round was the participation from several strategic investors including HubSpot, Okta and Next47, the venture firm backed by Siemens. The company has now raised over $39 million, according to Crunchbase data.

That investor interest could stem from the fact that DataGrail helps organizations find data by building connectors to popular applications and then helps ensure that they are in compliance with customer privacy regulations such as GDPR, CCPA and similar laws.

“DataGrail [is really] the first integrated solution with over 900 integrations (up from 180 in 2019) to different apps and infrastructure platforms that allow the product to detect when new apps or new infrastructure platforms are added, and then also perform automated data discovery across those applications,” company CEO and co-founder Daniel Barber explained to me. This helps users find customer data wherever it lives and enables them to comply with legal requirements to manage and protect that data.

Victoria Treyger, general partner at lead investor Felicis Ventures, says that one of the things that attracted her to DataGrail was that she had to help implement GDPR requirements at a previous venture and felt the pain first hand. She said that her firm tends to look for startups in large markets where the product or service being offered is a critical need rather than an option, and she believes that DataGrail is an example of that.

“I really liked the fact that privacy management is such a hard problem, and it is not optional. As a business, you have to manage privacy requests, which you may do manually or you may do it with a solution like DataGrail,” Treyger told me.

HubSpot’s Andrew Lindsay, who is SVP of corporate and business development, says his company is both a customer and an investor because DataGrail is helping HubSpot customers navigate the complexity of privacy regulation. “DataGrail’s unique ecosystem approach, where they are integrating with key SaaS and business applications, is an easy way for many of our joint customers to protect their customers’ privacy,” Lindsay said.

The company has 40 employees today, with plans to grow to 90 or 100 by the end of this year. It’s worth noting that Treyger is joining the board, which already has three other women. That shows a commitment to gender diversity at the board level that is not typical for startups.

News: Tackling deep-seated bias in tech with Haben Girma, Mutale Nkonde, and Safiya Noble


Advances in technology provide all kinds of benefits, but also introduce risks — especially to already marginalized populations. AI for the People’s Mutale Nkonde, disability rights lawyer Haben Girma, and author of Algorithms of Oppression Safiya Umoja Noble have studied and documented these risks for years in their work. They joined us at TC Sessions: Justice 2021 to talk about the deep origins and repercussions of bias in tech, and where to start when it comes to fixing them.


On bias in tech versus bias in people

When it comes to identifying bias in tech, there are two ways of coming at it: the tech itself and the people who are putting it to work. A facial recognition system may itself be racist (such as working poorly with dark skin) or be used in furtherance of racist policies (like stop and frisk).

Nkonde: There is the problem of technologies which are inherently racist, or sexist, or ableist, as Haben so beautifully pointed out. But there is another part… an imagination for technologies that could actually serve all people. And if the scientists who are creating those technologies don’t have experience outside of their own experiences, and we’re sitting in a moment where Google AI has got rid of [Margaret] Mitchell and Timnit Gebru, both of whom were technologists, researchers from minoritized communities who were thinking about new and different ways that tools could be designed… then you may not see them coming to products. I’d say that the two are definitely married. (Timestamp: 3:00)


On the danger in ‘banal’ technologies

Bias does not only exist in controversial tech like facial recognition. Search engines, algorithmic news feeds, and other things we tend to take for granted can also contain harmful biases or contribute to them.

Noble: My concerns were with what we might think of as just banal technologies, things that we really don’t give a second thought to, and that also present themselves as widely neutral, and valuable. Of course this is where I became interested in looking at Google search, because Google’s own kind of declaration that they were interested in organizing all the world’s knowledge, I think was a pretty big claim. I’m coming out of the field of Library and Information Science and thinking about, I don’t know, thousands of years of librarians, for example, around the world, who have been indeed organizing the world’s knowledge, and what it means to have an advertising company, quite frankly, data mine our knowledge, but also commingle it with things like disinformation, propaganda, patently false information and ideas, and really flatten our ability to understand knowledge and good information. (Timestamp: 5:13)


On how excluding groups harms them twice over

Haben Girma, who is deafblind, has advocated for accessibility with the skills she learned at Harvard Law. But the lack of accessibility goes deeper than small oversights like failing to caption images properly.

Girma: So most of the technology that’s built was not imagined for disabled people, which is frustrating… and also absolutely ridiculous. Tech has so much potential to exist in visual forms, in auditory forms, in tactile forms, and even smell and taste. It’s up to the designers to create tools that everyone can use. (Timestamp: 0:56)

A disturbing viral trend on TikTok recently questioned the story of deafblind icon Helen Keller. Doubt that she existed as described or did the things she did was widespread on the platform — and because TikTok is not designed for accessibility, others like Keller are excluded from the conversation and effectively erased from consideration in addition to being the subject of false claims.

Girma: Deafblind people have used technology for quite a while, and were early users of technology, including being designers and engineers. We are on many of the social media platforms; there are blind and deafblind people on Twitter. TikTok was not designed with accessibility in mind.

When you have a space where there are few disabled people, ableism grows. People on TikTok have questioned the existence of Helen Keller, because the people on the platform can’t imagine how a deafblind person would write a book or travel around the world, things that are well documented that Helen Keller did. And there’s also lots of information on how blind and deafblind people are doing these things today, writing books today, using technology today. So when you have these spaces where there are no disabled people, or very few disabled people, ableism and negative biases grow more rapidly. And that’s incredibly harmful, because the people there are missing out on talented, diverse voices. (Timestamp: 12:16)


On tech deployed against black communities

The flip side of racism within tech is ordinary tech being used by racist institutions. When law enforcement employs “objective” technology like license plate readers or biometric checks, they bring their own systematic biases and troubling objectives.

Nkonde: One of the things that that really brought me to was this whole host of technologies that when used by security forces, or police, reinforce these discriminatory impacts on black communities. So that could be the way license plate readers were used by ICE to identify cars, and when they pulled people over, they would do these additional biometric checks, whether it was fingerprinting or iris readers, and then use that to criminalize these people onto the road to deportation. (Timestamp: 17:16)

And when the two forms of bias are combined, certain groups are put at serious disadvantage:

Nkonde: We’re seeing how all of these technologies on their own, are impacting black lives, but imagine when all of those technologies are together, imagine when, here in New York, I walked to the subway to take a train because I have to go to work. And my face is captured by a CCTV camera that could wrongly put me at the scene of a crime because it does not recognize my humanity, because black faces are not recognized by those systems. That’s a very old idea that really takes us back to this idea that black people aren’t human, they’re in fact three fifths of a human, which was at the founding of this country, right? But we’re reproducing that idea through technology. (Timestamp: 19:00)


On the business consequences of failing to address bias and diversity

While companies should be trying to do the right thing, it may help speed things up if there’s a financial incentive as well. And increasingly there is real liability resulting from failing to consider these problems. For instance, if your company produces an AI solution that’s found to be seriously biased, you not only lose business but may find yourself the subject of civil and government lawsuits.

Noble: I think that first of all, there’s a tremendous amount of risk by not taking up these issues. I’ve heard that the risk management profile, for example for a company like Facebook, in terms of harm, what they can’t solve with software and AI, that they use human beings, quite frankly to sort through, for example, the risk that they face is probably estimated around $2 billion, right?

If you’re talking about a $2 billion risk, I think then this is a decision that exceeds the design desires of software engineers. (Timestamp: 24:25)

Not just bias but unintended consequences need to be considered, such as how an app or service may be abused in ways the creators might not have thought of.

Noble: I think you have to think far beyond, you know, like, what you can do versus what you should do, or what’s ethical and responsible to do, and I think these conversations now can no longer be avoided. This is a place where founders, venture capitalists, everything, every VC in the Valley on Sand Hill Road should have a person who is responsible for thinking about the adverse effects of the products that they might invest in. (Timestamp: 25:43)


On getting people in the room before, not after the crisis

The tendency to “ship it and fix it” rather than include accessibility from the ground up is increasingly being questioned by both advocates and developers. Turns out it’s better for everyone, and cheaper in the long run, to do it right the first time.

Girma: The answer to most of these questions is to have the people involved. ‘Nothing about us without us’ is the saying in the Disability Justice Movement, so if these VCs and companies are thinking about investing in a solution that they think will be good for the world, ask disability justice advocates; get us involved. (Timestamp: 29:25)

We need the VCs to also connect with Disability Justice advocates, and really find someone who has knowledge and background in accessibility and tech. Same thing for any company. All companies, whether their technology already exists or is in the process of being built, should be consulting on accessibility. It’s easier to make something accessible if you design for accessibility, rather than trying to make it accessible afterwards. It’s like having an elevator in a physical building. You don’t build the structure and then think about adding an elevator. You think about adding an elevator before you design it. (Timestamp: 30:55)

Read the full transcript here.



News: Populus AI plots expansion with $5M in new funding


The wave of shared electric scooters that swept through cities several years ago helped Populus AI get its start. Now, surging demand for delivery — and the pressure it places on curb space — is helping the transportation data startup attract new capital and expand to more cities.

Populus, a San Francisco-based startup founded in 2017, has raised $5 million from new investors Storm Ventures and contract manufacturing and supplier company Magna along with existing backers Precursor, Relay Ventures and Ulu Ventures. The company has raised nearly $9 million to date.

Populus plans to use that capital to expand to more cities, growth that the company believes will be driven by demand for street and curb management. Populus has contracts with more than 80 cities, including Oakland, San Diego, and Tel Aviv, and works with more than 25 micromobility operators. Co-founder and CEO Regina Clewlow said their aim is to triple the number of cities over the next 18 months.

The Populus platform is a software-as-a-service product that operates like a two-way street. The company pulls data from fleets of shared e-bikes, scooters, mopeds and car-sharing vehicles and delivers that information to cities to help planners and regulators understand and manage how streets and curbs are used. Cities can also use the Populus API to share their rules of the road — restrictions on motorized vehicles, preferred scooter parking areas and information on bike lanes, for instance — with mapping platforms and any other third party.

“In recent years, there has been significant growth in venture-backed startups delivering software to cities, especially as transportation becomes increasingly connected and automated,” Frederik Groce, a partner at Storm Ventures and founder of BLCK VC, said in a statement. “Populus is uniquely positioned as the market leader to support cities’ digital transformation.”

Last year, Populus added a street manager to the platform to allow cities to communicate new policies such as slow or shared streets that prioritize bikes and pedestrians, areas designated for outdoor dining and construction closures.

The curb management feature, which was also added last year, will be the main driver of growth in 2021, Clewlow said. Cities can use that data to set dynamic pricing for curb space, for example.

“What most cities really want to use our digital technology for is managing commercial fleets including delivery,” Clewlow said. Curb space is being used by both scheduled and on-demand vehicles, she said, adding that these areas are not designed for the volume of deliveries that occur today.

“Cities are continuing to see a boom in delivery; that’s a trend that predated COVID and obviously accelerated during COVID,” Clewlow said. “A real pain point for cities is managing how that space is used by commercial delivery vehicles.”

News: Adobe delivers native Photoshop for Apple Silicon Macs and a way to enlarge images without losing detail


Adobe has been moving quickly to update its imaging software to work natively on Apple’s new in-house processors for Macs, starting with the M1-based MacBook Pro and MacBook Air released late last year. After shipping native versions of Lightroom and Camera Raw, it’s now releasing an Apple Silicon-optimized version of Photoshop, which delivers big performance gains vs. the Intel version running on Apple’s Rosetta 2 software emulation layer.

How much better? Per internal testing, Adobe says that users should see improvements of up to 1.5x faster performance on a number of different features offered by Photoshop, vs. the same tasks being done on the emulated version. That’s just the start, however, since Adobe says it’s going to continue to coax additional performance improvements out of the software on Apple Silicon in collaboration with Apple over time. Some features are also still missing from the M1-friendly edition, including the ‘Invite to Edit Cloud Documents’ and ‘Preset Syncing’ options, but those will be ported over in future iterations as well.

In addition to the Apple Silicon version of Photoshop, Adobe is also releasing a new Super Resolution feature in the Camera Raw plugin (to be released for Lightroom later) that ships with the software. This is an image enlarging feature that uses machine learning trained on a massive image dataset to blow up pictures to larger sizes while still preserving details. Adobe has previously offered a super resolution option that combined multiple exposures to boost resolution, but this works from a single photo.

It’s the classic ‘Computer, enhance’ sci-fi feature made real, and it builds on work that Photoshop previously did to introduce its ‘Enhance details’ feature. If you’re not a strict Adobe loyalist, you might also be familiar with Pixelmator Pro’s ‘ML Super Resolution’ feature, which works in much the same way – albeit using a different ML model and training data set.


Adobe’s Super Resolution in action

The bottom line is that Adobe’s Super Resolution will output an image with twice the horizontal and twice the vertical resolution – meaning in total, it has 4x the number of pixels. It’ll do that while preserving detail and sharpness, which adds up to allowing you to make larger prints from images that previously wouldn’t stand up to that kind of enlargement. It’s also great for cropping in on photos in your collection to capture tighter shots of elements that previously would’ve been rendered blurry and disappointing as a result.
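
To put rough numbers on that scaling, here’s a minimal sketch in Python (not Adobe’s code; the image dimensions are hypothetical) showing how doubling both sides quadruples the pixel count:

```python
def super_resolution_size(width: int, height: int, factor: int = 2):
    """Return the upscaled dimensions and the resulting pixel-count multiplier."""
    new_w, new_h = width * factor, height * factor
    gain = (new_w * new_h) / (width * height)  # equals factor ** 2
    return new_w, new_h, gain

# Hypothetical example: a 6000 x 4000 (24 MP) photo becomes 12000 x 8000 (96 MP).
print(super_resolution_size(6000, 4000))  # -> (12000, 8000, 4.0)
```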

This feature benefits greatly from GPUs that are optimized for machine learning jobs, including CoreML and Windows ML. That means that Apple’s M1 chip is a perfect fit, since it includes a dedicated ML processing region called the Neural Engine. Likewise, Nvidia’s RTX series of GPUs and their TensorCores are well-suited to the task.

Adobe also released some major updates for Photoshop for iPad, including version history for its Cloud Documents non-local storage. You can also now store versions of Cloud Documents offline and edit them locally on your device.

News: Israel’s Retrain.ai closes $13M to use AI to understand early signals in the changing jobs market


Israel’s retrain.ai, which uses AI and machine learning to read job boards at scale and gain insight into where the job market is going, has closed a $9M Series A led by Square Peg. Since retrain.ai’s $4M seed round last year was unannounced (led by Hetz Ventures, with TechAviv and .406 Ventures participating), that means it’s raised a total of $13 million. Its competitors include Pymetrics, which has raised $56.6M, and Eightfold.ai, which has raised $176.8M.

As well as the funding, the company has secured a first deal with the Israeli Department of Labor to look at the changing nature of the Israeli job market in light of the pandemic.

With technology eating into the traditional labour market, retrain.ai says its platform can look at what jobs are being advertised and which jobs are going down in popularity, and see early-warning signals as to where new jobs are going to appear. This can help form policy for large organizations and governments.

retrain.ai’s CEO is Dr. Shay David, who is best known for co-founding the video enterprise leader Kaltura, which first appeared at TechCrunch’s first ever conference in 2007. Isabelle Bichler-Eliasaf is the company’s COO and Avi Simon is retrain.ai’s CTO.

Dr. Shay David said: “What was once the regular tide of change in the workforce has evolved into a tsunami, especially pronounced by COVID-19 and its huge impact on the labor market – this has been a wake-up call. Unemployment and underemployment are going to affect a billion people globally in the next few decades. Our vision is to help 10 million workers get the right jobs by 2025 and help organizations navigate efficiently through the wave of change.”
 
retrain.ai is the first investment by Square Peg’s new $450M fund. The VC previously invested in Canva, Stripe, Fiverr and Airwallex.

News: TikTok rolls out new commenting features aimed at preventing bullying


On the heels of last week’s launch of a new Q&A format for creators responding to viewer questions, TikTok today announced it’s rolling out new commenting features. Creators will now be able to control which comments can be posted on their content, before those comments go live. Another new addition, aimed at users who are commenting, will pop up a box that prompts the user to reconsider posting a comment that may be inappropriate or unkind.

TikTok says the goal with the new features is to maintain a supportive, positive environment where people can focus on being creative and finding community.


Instead of reactively removing offensive comments, creators who choose to use the new “Filter All Comments” feature will instead get to choose which comments appear on their videos. When enabled, they’ll need to go through each comment individually to approve them using a new comment management tool.

This feature builds on TikTok’s existing comment controls, which allow creators to filter spam and other offensive comments or filter by keywords, similar to other social apps like Instagram.


But Filter All Comments means comments won’t even go live at all unless the creator approves them. This gives creators full control over their presence on the platform and could prevent bullying and abuse. But it also could allow creators to get away with spreading false information without any pushback, or make it seem like they’re more well-liked than they actually are. That could be troublesome — especially since brands making decisions about which creators to work with to promote their products could be getting the wrong impression about a user’s likability.

The other feature will instead push users to reconsider posting bad comments — meaning those that appear to be bullying or inappropriate. It will also remind users of TikTok’s Community Guidelines and allow them to edit their comments before sharing.


These sorts of “nudges” help by slowing people down and giving them time to pause and think about what they’re saying, instead of being so quick to react. Already, TikTok is using nudges to ask users whether or not they want to share unsubstantiated claims that fact checkers can’t verify, in an attempt to slow the spread of misinformation.

It took other social networks years to add prompts that ask users to stop and think before posting. Instagram, for example, launched in 2010 but it was nearly a decade later before it decided to try a feature that prompted users to reconsider before posting offensive comments. Twitter, meanwhile, just last month said it was running another test that asks users to reconsider harmful replies. It’s been running variations of this same test for nearly a year.

Social networks have been hesitant to build more prompts like this into their platforms, even though they’ve demonstrated a strong ability to influence user actions. When Twitter began prompting users to read articles linked in a tweet before retweeting, for instance, users would open those articles 40% more often. But more often than not, networks choose to downrank or hide negative comments, like Instagram does with “View Hidden Comments” or Twitter with “Hide Replies.”

TikTok says it’s consulting with industry partners in developing its new policies and features, and today also announced a partnership with the Cyberbullying Research Center (CRC), which develops research around cyberbullying and online abuse and misuse. The company says it will collaborate with CRC to develop other initiatives going forward to help promote a positive environment.

 

News: Tetrate, the company born out of Istio’s open source app networking project, raises $40 million


Tetrate, the company commercializing an open source networking project that allows for easier data sharing across different applications, has raised $40 million.

The round, led by Sapphire Ventures, underscores the importance of the Istio project and just how critical services that facilitate cross-platform data sharing have become.

Sapphire was joined by other new investors including Scale Venture Partners and NTTVC, along with existing investors Dell Technologies Capital, Intel Capital, 8VC and Samsung NEXT.

The company said it would use the cash to further develop its hybrid cloud application networking platform and support a new product, based on Istio, that makes the application service mesh easier to use, according to a statement from the company. Geographic expansion to Latin America, Europe and Asia is also on the menu now that it has 40 million simoleons to play around with (personally I’d have converted all that money into bills and gone swimming in it like Scrooge McDuck).

“As the microservices revolution picks up steam, it’s indispensable to use Istio for managing applications built with microservices and deployed on containers. Both the product and the background of the founding team lead us to believe that Tetrate is poised to bring Istio into the mainstream for enterprises by making it easy to manage and deploy on multi-cloud and hybrid cloud environments,” said Jai Das, partner, president and co-founder of the multi-billion dollar firm, who is joining the Tetrate board. “The applications we use daily require a lot of work in the background, and Tetrate helps make that happen with its Istio-based service mesh technology, which helps route traffic between microservices, add visibility and enhance security.”

Founded in 2018, Tetrate formally launched in 2019 with a $12.5 million round that boosted the company’s profile and helped the company commercialize and professionalize services around the Istio and Envoy Proxy open source projects.

Tons of really big customers, including the U.S. Department of Defense, use Tetrate’s services currently. In the military, Tetrate powers the DevSecOps platform called Platform One.

“We partnered with Tetrate to help secure and smoothly operate Platform One with Istio. Platform One works with the most critical systems across the DoD. The Tetrate team has provided world class expertise, trained our team members, reviewed our platform architecture and configurations, and helped with debugging and upgrades,” said Nicolas Chaillan, the chief software officer for the US Air Force, in a statement. “We’re getting excellent production support for running our platform smoothly and we rely on them and their platform for a critical layer of our stack.”
