From Microsoft VP to law student: What this exec’s career transition says about AI and the law

Mike Angiulo worked at Microsoft for 25 years as an engineering manager and vice president for products including Windows PCs, Microsoft Outlook, Xbox, Surface, and cloud and artificial intelligence technologies. But it actually wasn't the work Angiulo originally envisioned doing. He had planned to be a lawyer, delaying those plans after he started at Microsoft in his early 20s.

Now, at age 47, nearly three decades later, he's circling back to his original plan — going back to school and preparing for a second career as a lawyer focused on some of the most fascinating and difficult questions facing the same kinds of technologies that he helped to create for so many years.

“I’m a big believer that just the prevalence of big data, the speed of the cloud platforms, the modernity of the algorithms, combine to the point where every single business is going to be relying on deep data insights, probably to some automated extent,” he says. “And increasingly, people won’t be able to explain how they work.”

And that raises all sorts of fascinating questions about the future of technology and the law, as we explore on this special episode of the GeekWire Podcast.

Listen below, or subscribe to GeekWire in any podcast app, and continue reading for an edited transcript.

Todd Bishop: After more than two decades at Microsoft, Mike Angiulo’s decision to change careers resulted from an epiphany, courtesy of his daughter.

Mike Angiulo: “The thing that changed in me, it was a little bit spurred by watching my older daughter, Emily. She’s at George Washington University in a pre-med program. I was watching her write her college applications. And she was writing these essays about how she really wants to change the world and make it better, especially for women and healthcare. And it reminded me of the feelings I had when I was young, saying, I want to help people be safe, or I want to be at the forefront of this area of law. Usually parents are there, prodding their kids to think big. But it was really the other way around, you know, she sort of unsettled me a little bit.”

Todd Bishop: The law runs in his family. His grandfather immigrated from Italy and became an immigration judge with the Department of Justice. His dad was a doctor, a lawyer and, eventually, became a judge in Arizona. His dad worked for years in a county hospital that served the indigent population on the south side of Tucson. Angiulo saw the disparities in care depending on patients’ financial situation, national background, and immigration status.

Angiulo: “Growing up, I saw law as always a way to be able to effect some big change, something important. And I went to engineering school with the initial thought that I was going to do engineering law. And I didn’t know what that really meant at the time. The obvious route for that is to work on patents or intellectual property. But I was growing up in an era that was just sort of a big explosion of consumer products and technologies. And I was thinking about product liability and how things worked and how people were kept safe. What changed along the way was, I had an internship. It was really just supposed to be a summer internship. It’s like Gilligan’s three-hour tour.”

TB: He arrived at Microsoft for that internship in 1993 … and he was hooked.

Angiulo: “And I got to work on a project immediately for Bill Gates himself. He was going to do a keynote address at COMDEX. I got to work on it personally, and I got to meet him and do a little work with him. And I’m telling you, as a 20-year-old it’s intoxicating to not be treated like an intern. And that was kind of the magic thing about Microsoft at the time, for sure, and even to this day, that it’s not a seniority-based organization. If you’re young and bright, and you’ve got something to say, people will listen, and I really fell in love with that. So I decided to delay going to law school for a year. I did a master’s in chemical engineering at the University of Washington as my cover story for why I was up here.”

TB: He knows that was a pretty unusual cover story!

Angiulo: “Well, I didn’t want to tell my parents that I was dropping out of my educational plans for this job at a software company! And that didn’t even make sense, given that family history. So by being in a master’s program, I was buying myself time, just because I wanted to work on the stuff that I was working on at Microsoft. And that kind of ran its course, and I graduated, and now I needed another reason.”

TB: He kept putting off law school for another year … and then another year … and then another year.

Angiulo: “The joke is that to go to law school, you have to take the LSAT, the LSATs.”

TB: The LSAT is the Law School Admission Test … and he got to know it well.

Angiulo: “Those scores expire. So when you go to apply to a law school, you have to have taken the LSATs within the last, I think, three years. So I had LSATs ready to go because I was about to go to law school. And then three years later, they expired, and I took them again and again. So I’ve been about to do this for about 20 years. I was always ready to kind of wait till next year, wait till next year, because the stuff that was happening at Microsoft was changing the world.”

TB: And then, one day, there was his daughter and her pre-med college application nudging him back to his original goal … and there was another factor steering him toward the law — an observation about the speed of innovation in software versus the law’s ability to keep up.

Angiulo: “I saw that the law, the law in general — you can look at it in terms of civil law, in terms of regulation, so I’m just using a very general term — our legal structures and the innovation speed that was happening in the software space seemed out of whack. And I want to work on the intersection of those two things in some way. And I didn’t leave with a specific plan, because frankly, that’s not how my career even worked. As long as you were learning, as long as you had something good to add, opportunities were there. But I could have never planned where I was going to be in two or three or four years along the whole way at Microsoft.”

TB: He started to look at the emerging field of artificial intelligence.

Angiulo: “I’m by no means a leading expert in AI and how AI works. But I’m a big believer that just the prevalence of big data, the speed of the cloud platforms, the modernity of the algorithms combine to the point where every single business is going to be relying on deep data insights, probably to some automated extent. And increasingly, people won’t be able to explain how they work.”

TB: One area where these AI products are being used is predictive policing. It’s not like the Tom Cruise movie “Minority Report,” where the prediction focused on individuals. These systems work to create something like a weather report, he says, except instead of predicting rain they predict crime.

Angiulo: “For example, there’s a parking lot where people, when it was frozen the night before, would leave their cars running in the morning. They’d be running unattended, puffing exhaust while they warmed up and melted the ice off. And people would steal those cars. And the system learned that there was this pattern that, on those kinds of days with those temperatures, [theft would be more likely to occur]. It’s the kind of insight that an experienced beat cop would know, on their own, which is the reason why these systems weren’t so troubling as they started. They look like they’re just kind of aiding the decisions that people would ordinarily feel justified and be accountable for making.”

TB: But these systems are getting more sophisticated and taking on more complex tasks related to crime.

Angiulo: “Now they’re starting to be used to issue a risk score or a threat score on serving a warrant. So if a police officer is going to go serve a warrant on a property, which they’re required to do when a judge signs a warrant … they can serve it by knocking on the door, or they can serve it by not knocking — a no-knock warrant, where you see them rolling heavy with those big metal things that knock the doors down and whatnot. And there are times where serving a no-knock warrant makes a lot of sense. If you know you’re going to have someone that’s going to give active resistance, if you know someone is going to start flushing drugs down the toilet as soon as you do the polite knock, you might need this option. The issue is, as soon as you start serving a no-knock warrant, you’re serving it with guns drawn, and the chance of violence gets much higher.

So you can use a system that takes a bunch of factors into account, that predicts the risk of a particular warrant. The issue is, what happens when it gets that wrong, and police roll heavy, knock down the front door, surprise someone who’s got a remote control in their hand, and someone at the wrong address or whatever is shot? Or you do that process and someone later challenges that decision and says, ‘Did you take into account the socioeconomic factors? Did you take into account race?’ So for example, there are a lot of laws that are very specific, that apply heightened standards of scrutiny for decisions based on state action that took race into account. Now normally, you can actually answer the question, and then the law can be applied. But what happens when no one really knows which factors were taken into account in that black box?”

“One of the cool things about these algorithms is, as they develop, they’re better than an algorithm that you would have been able to think of on your own. And in fact, they almost get to the point where you can’t quite understand it. So imagine just a simplistic case, where you’re asking a person, did you take this factor into account in your decision? You know, you’re giving someone a mortgage. Did you take into account the location of their neighborhood, the redlining stuff? Well, there’s an answer to that; it’s yes or no. I mean, the person could lie or not, but it’s still a fact whether the person took that into account. But now you start asking, what did you take into account when you made this deviation around this weather pattern or around this accident? And it would take an expert, an AI expert, to even understand what factors were weighted in what way. And so now you’ve got this challenge where, in liability cases like this, you’re often going to a jury. So you’ve got to go all the way from a technical expert that has to explain things within a certain limited legal framework, because the law is very careful about allowing expert testimony at trials, because a jury could give too much weight to an expert that simply says, ‘Yes, this caused that.’”

TB: What if there’s a major storm coming, like a tornado or a hurricane, and you want to use an AI system to predict potential injuries and prepare in advance for the recovery phase? There are possible pitfalls here, too.

Angiulo: “Well, wouldn’t it be nice, instead of keeping all the ambulances in the garage at the hospital, the trucks at the fire department — why not have them staged in places where they would have optimal response times to places where you know you’re going to have a problem? It seems like some combination of an Uber-type system plus a predictive-type system like this could let you stage ambulances closer to the places of likely injury. Well, that just seems smarter. I mean, you could just predict lives that you could save by predictively allocating scarce resources of any kind. But here’s the issue. Maybe that system has all of this data and says you need ambulances over by this housing development, because the people in that housing development also have health care. And because they have health care, they have a bunch of health records. And because they have health records, we know that there are people there who are going to need particular services. But now you might have this other neighborhood over here, of a different socioeconomic class. They don’t have health care, so they don’t have health records. So the system doesn’t think about them. And so now you’ve just prioritized an ambulance toward one direction or another. If you ignored the system, you’d have a suboptimal outcome, so it’s not like you’re deploying the system to, in effect, reinforce race- or class-based outcomes — but you might be, and how do you know?”

TB: There’s something we haven’t talked about yet about former Microsoft executive Mike Angiulo. He’s not just a law student — he’s a pilot. And he has a particular interest in how AI systems impact aviation. This summer, he interned at Perkins Coie, a law firm that specializes in aviation cases and counts Boeing among its clients. He wasn’t able to talk about the Boeing 737 MAX case because of that, but we did talk about aviation issues in general.

Angiulo: “Now, with aviation cases, jurisdiction is really complicated. You’ve got an aircraft maybe made in multiple states, delivered to either the military or an airline that’s operating in another state; they may have their headquarters in yet another state. The people involved in an incident or accident themselves may be U.S. citizens or not, may be residents of a given state or not. And then the location of the accident may have nothing to do with any of those states altogether. So there’s always a lot of very complicated work to understand the jurisdictional aspects of these cases. For a simple example, in choice of law, where you figure out which state’s law is going to apply in, say, a car crash, one of the key factors is, where did the crash happen? It’s not always that that’s where the law is going to apply, but you can imagine that the state where a highway accident happens has an interest in making sure that its laws are being followed on the highway. And so that site-of-the-crash factor is really important. So what happens in a Malaysia Airlines case where the plane straight up disappears? There is no site. And so the laws weren’t written in a way to even be able to handle some of the complexity of aviation-specific accidents. And so that’s one of the reasons that I think it’s a really interesting and important space.”

TB: He feels that it would be better — and safer — to innovate with AI in air travel first rather than on the nation’s highways. For example, he mentioned a new Garmin auto-land system that was certified for single-engine turboprop airplanes. In an emergency, anyone on board can push a single button, and the system takes over and lands the plane automatically at the nearest airport. He says that kind of innovation is easier to do in the air than on the ground.

Angiulo: “So you have one set of standards and one set of bodies for operating in the National Airspace System. So it’s a lot easier to get certified for something like that, because you only have one set of standards. You also have a lot more money going into R&D, chasing safety improvements. You have this really well-balanced regulatory and innovative partnership between the government and aviation. You could just look at this and say you can spend $400-500 to buy a ticket to fly over the ocean, and you have a bigger risk of choking to death on your meal than being in an accident. And the statistics of the safety are so incredibly high, yet the public has this really cheap, very reliable access, because the regulation and the innovation have gone sort of in lockstep ever since the invention of the airplane. But if you go to the highway system, it’s none of those things. Every state has regulations on how things are allowed to operate. You have a patchwork of legal approaches across the way. You’re going to have people behind the wheel of vehicles for at least the next 40 or 50 years, even if, tomorrow, fully autonomous vehicles were available. So you’re going to have a coexistence problem, and you’re going to have a really complicated legal framework. Then on top of it, you’ve got a bunch of companies that themselves are not inherently regulated the way aviation technology is. And so they’re slamming technologies together because of market pressures, in a way that you would never do in designing aircraft. [AI] will revolutionize transportation. But I feel like aviation is the best first place to make that progress and then have it sort of trickle out to other environments.”

TB: As you might expect, his time at Microsoft influenced his view of the law and product liability in some specific ways.

Angiulo: “There are a couple things that really came away from it. One of them is just the absolute, almost religious belief in the relentless innovation that’s going to come from software development. And as I said, it’s gonna come. Even though we look at today, and you look at the devices and the access to information, it almost seems unbelievably complete — meanwhile, if you just go back 10 or 12 years or more, you realize how far we’ve come. The rate of innovation is accelerating. A new startup, with a cloud service backing it, really can be anybody anywhere in the world. All the MIT OpenCourseWare for learning how to program is free on YouTube. Any human with a broadband connection can change the world in a way that would have required millions of dollars of institutional backing just minutes ago. So if you look at the innovation curve, and you look at the energy, and you look at how much money gets saved and how much value gets created — that’s for sure. So I just know that. I saw it. I participated in it firsthand.

Another thing that I learned is what it’s like when a big corporation is making big decisions. And look, the threat of legal action is one of the things that helps companies make responsible decisions. And there are a few areas of law where that’s not good enough. So environmental law, for example: the federal government makes a point of prosecuting criminally when people have intentionally dumped pollutants hoping that they just wouldn’t get caught, because you don’t really want people doing a financial calculation of saying, ‘Well, I’m only going to get caught 10% of the time, but I saved 20x the money. So it’s a good deal, go for it.’ But outside of those crazy cases, just the balance of risk-reward does play into the thought process as products are being developed. So having that pressure be right-sized, balanced, healthy, productive, aligned with facts — it’s a good thing. And so I saw that firsthand. And I also saw the fact that as soon as you get five smart people together, you get 20, you get 100 smart people together, you start getting some crazy things happen. And you get 1,000, 10,000 or 100,000 smart people, and it’s possible for companies to make mistakes, even though lots of intelligent, bright actors are all in there at the same time. Whether it’s information flow, organizational politics, different kinds of pressures around the world — I saw what big-company thinking and life looks like. And for sure, from a lawyer’s perspective, understanding that is really important, because you know what to look for in terms of the evidence. You understand how the accountability flow works in a large organization.”

TB: Did he ever imagine himself at this stage of his life making this big career change?

Angiulo: I’ve been married just a little over 20 years and I bear in mind telling my then-girlfriend, now spouse, that I needed to be a product legal responsibility lawyer someday. She simply thought that was a really odd factor to hear from an adolescent — very oddly particular. And so even the entire time I used to be at Microsoft, I all the time had my eye to the law as a result of the logic issues, the thought behind it, it simply fascinates me. So in a approach, this continues to be my Plan A, however it’s a actually goofy timeline. It’s sort of humorous to be there at school and be twice as outdated as the classmates round me, but it surely makes me really feel younger, to let you know the reality. I’m completely loving it. You do get to a sure level the place you’ve been doing one thing for 25 years and you begin pondering, ‘OK, I assume this is what I do.’ That man was by no means gonna be me.”

TB: Mike Angiulo is now in his third year at the University of Washington School of Law, and he has accepted a job for after he graduates at Perkins Coie here in Seattle, where he’ll be specializing in … you guessed it … aviation and software.

Podcast editing and production by Curt Milton. Music by Daniel L.K. Caldwell.
