Nine great industry news stories from this week you may have missed.
More women must enter male-dominated legal tech to guard against AI bias
At last month's Global Legal Forum in London, keynote speaker Christina Blacklaws presented the idea that more women need to pursue careers in legal tech to ensure bias doesn't get "hardwired" into AI-assisted decision making. According to Blacklaws, the industry's innovations are currently being weighed down by systemic bias. Read the full article to learn why, if more women don't come on board, the future of AI in legal tech could be in danger.
Why Amazon knows so much about you
When BBC technology desk editor Leo Kelion asked Amazon to disclose everything it knew about him, the level of detail was mind-bending. Among the hundreds of files he received were transcripts of all 31,082 interactions his family had had with Alexa — including the precise number of times his daughter had asked to hear "Let It Go" from Frozen (it was 48). Audio clips were provided too. Another file revealed 2,670 product searches Leo had carried out in the Amazon store since 2017. The database could even show which device he was using and in which area.
For many of us, this level of detail may sound over the top. Amazon's CEO Jeff Bezos frames it as "customer obsession." Amazon knows not just your preferences, but the millions of preferences of customers who shop like you. Learn the story of how Amazon grew from a small online bookstore into the global success it is today, collecting data every step of the way.
What will the law firm of the 2020s look like?
Legal tech is here to stay. A panel of lawyers gathered with Legal Cheek to discuss the future of law firms, and on that point the group was unanimous. The panel agreed technology will continue to reshape legal practice, freeing lawyers to focus on more stimulating work. Learn more here about their thoughts on why core skills won't change, how the Green Economy will come into play, and what political and economic uncertainty could bring to law firms this decade.
AI and the Law: Helping lawyers while avoiding biased algorithms
AI bias is a hot topic this week. In legal tech, identifying the many rules specific to each sector can be a challenge: tax law, for instance, has far more explicit rules than criminal law. Blue J Legal, which grew out of a University of Toronto course, is combining legal and computer science expertise to solve the problem. Read Forbes.com's recent article to learn how they're helping lawyers evaluate their cases.
Your doorbell camera spied on you. Now what?
Ring, which is owned by Amazon, has generated its share of headlines, including how the company fired four employees over the last four years for watching customers' videos. Last month, security researchers also found that Ring's apps contained hidden code that shared customer data with third-party marketers. And in December, hackers hijacked the Ring cameras of multiple families, using the devices' speakers to verbally assault some of them. If you have a Ring system and are worried about your personal security, take a look at this recent New York Times article. From using a stronger password to installing a tracker blocker on your phone, it walks through all the steps you can take to enhance your privacy.
How a Canadian start-up used AI to track the China virus
On the shores of Lake Ontario, a Canadian start-up raised one of the earliest alarms about the risk posed by the mystery virus that emerged in the Chinese city of Wuhan. How did it do it? Artificial intelligence. BlueDot has developed an algorithm that can sift through hundreds of thousands of news stories a day along with air traffic information in order to detect and monitor the spread of infectious diseases. The AI sent an alert to clients on December 31 about the coronavirus outbreak, days before major public health officials made statements. Learn all the details of the emerging AI here.
Information about 69,000 Phoenix pay system victims sent in error
More than 69,000 public servants caught up in the Phoenix pay system debacle are now victims of a privacy breach after their personal information was accidentally emailed to the wrong people, says Public Services and Procurement Canada. The system has improperly paid tens of thousands of public servants since its launch in 2016, leaving some employees without pay for months at a time. To make matters worse, earlier this month a report naming 69,087 public servants was accidentally emailed to the wrong federal departments. See the full details on CBC.
Should law schools offer formal education on eDiscovery?
While many law schools may mention the importance of technical skills in passing, few students graduate with practical experience. That is, until now. The University of Florida Levin College of Law has dedicated full-time faculty to teaching electronic discovery. Learn the details of its 2020 theme, "Work Smarter, Not Harder," and how you can attend courses in person or online here.
Machine learning can't fix algorithmic bias. But humans can.
According to this last article, we can't blame algorithms for messing up. Instead, we have to look to their creators: humans. When technology like machine learning is designed, coded, built and scaled by a homogeneous team, the results can be disastrous and even, quite literally, deadly. Quartz reports that autonomous vehicles may have a harder time detecting pedestrians with darker skin. The study they cite found that because those driverless cars were built mostly by young white male engineers, fewer images of people with darker skin tones were used during development and testing, and the systems were 5 per cent less accurate at recognizing people with darker skin than those with lighter skin. Get the full story, and learn what's needed to change it, here.