Silicon Valley Battles States Over New Online Privacy & Safety Laws for Children - Ep. 46

Jack Sanker:

Welcome to Litigation Nation. I'm your host, Jack Sanker. And introducing my new co-host to the show, who you will hear from in a minute, Danessa Watkins. We're back from hiatus, and in case you don't remember, this is the show where we get you up to date on some of the most important and interesting legal developments. We'll keep you in the know.

Jack Sanker:

You can find us on Apple Podcasts, Spotify, YouTube, wherever you get your podcasts. Danessa, welcome to the show. Tell us about yourself and tell us what we're covering today.

Danessa Watkins:

Thanks Jack. Yes. I'm so happy to be joining the show. Like you, I am a lawyer in Chicago. My practice here is primarily focused on First Amendment, defamation, and privacy law.

Danessa Watkins:

However, in joining the show I'm very excited to delve into different areas across the country and keep everyone up to date on what's going on in the legal world. So today I'm going to be covering cases across the country. One, coming out of Massachusetts, where the Supreme Court there recently ruled that no one under the age of 21 who commits a crime should have to serve a mandatory life sentence without eligibility for parole. On the opposite side of the country, we have courts that are thwarting states' efforts to pass legislation that would protect children from the unfettered use of online data.

Jack Sanker:

And in New York, a federal district court just rejected a motion for attorney's fees after discovering that the attorneys relied pretty heavily on ChatGPT for the basis of those fees. So it's another negative ruling involving the use of ChatGPT in litigation. All of that and more, here's what you need to know.

Danessa Watkins:

Alright. So starting with our first topic here, we have a recent decision that came down in January of this year, 2024, from the Supreme Court of Massachusetts. And this was a sweeping finding that anyone under the age of 21 who commits a crime should not be subject to a mandatory sentence of life without eligibility for parole. The court actually found that that sentencing guideline was unconstitutional. Now, the background of this case: we have the defendant, Sheldon Mattis.

Danessa Watkins:

He was convicted of murder in the first degree, among some other charges, and his codefendant was also convicted of first degree murder. However, at the time, the codefendant was 17, whereas Sheldon Mattis was 18. So the 17-year-old offender did receive life in prison; however, he had eligibility for parole after 15 years. And because Mattis was a few months older, the mandatory sentencing guideline was life in prison without parole.

Danessa Watkins:

Now Mattis appealed that decision, and his argument was that the mandatory sentence was actually cruel and unusual punishment; that as an 18-year-old he should not be treated as an adult, because his cognitive development was not that of an adult. It was more aligned with that of a juvenile. So this went all the way up to the Supreme Court of the state of Massachusetts, and there the court found that it needed more information: background information, scientific information about where the science is at this point. When the legislature made these sentencing guidelines, it was 20 or more years ago, and certainly our ability to do neurological testing has improved in that time period.

Danessa Watkins:

There was also a US Supreme Court case that came down in 2012, and that's when, at the highest level, the Court found that juvenile offenders, so that's under 18, could not be subject to mandatory life without parole. So relying on that decision, the Massachusetts Supreme Court said, we need more information about this age group of 18-, 19-, and 20-year-olds. So back at the trial court level, the court heard from 4 experts over the course of a 4-day hearing. From the detailed evidence produced in the record, the judge made 4 core findings of fact regarding the science of emerging adult brains. First, emerging adults have a lack of impulse control similar to 16- and 17-year-olds in emotionally arousing situations.

Danessa Watkins:

Second, they are more prone to risk-taking in pursuit of rewards than those under 18 and those over 21. Third, they are more susceptible to peer influence than individuals over 21 years old. And fourth, they have a greater capacity for change than older individuals, and that's due to the plasticity of their brains. Now, I found this third finding somewhat interesting. So this idea of the effects of peer influence: I think we often just assume that high school kids are the ones most susceptible to peer pressure, but these scientific findings indicate that we really cannot discount how our friends may still influence us during early adulthood.

Danessa Watkins:

Now Jack, I wanna pose this to you, because I wonder if you can think of situations where you have, for example, a very bright college student. They are focused and determined to pursue their career. They do extracurriculars. However, they go on spring break. And they're on spring break with, you know, 15 of their best friends, and they wake up in the local jail the next day.

Danessa Watkins:

Any familiarity with those types of situations?

Jack Sanker:

Not the jail part, but it seems like the thrust of this ruling is an understanding that, from a developmental perspective and, I guess, a psychological perspective, the line of 18 years is probably a bit arbitrary, and it's expanding it to something that's more backed by our scientific understanding of when people are more fully developed and have more agency over their decision making and all of that. And it's really just moving that line back to better accommodate psychological development. Is that the gist of

Danessa Watkins:

it? Yeah. Absolutely. We've drawn this line at 18, but if we look at other laws in this country, that 18-year-old line is not so clear. So, for example, someone has to be 21 years of age to purchase and sell alcoholic beverages. Purchasing a handgun, becoming a police officer, gambling.

Danessa Watkins:

I mean, there are different age limits for these things. So then why should we set a hard line for criminal sentencing at 18? And that's what the courts are starting to look at. Massachusetts, however, has been the first to make a sweeping move of the line from 18 to 21.

Jack Sanker:

And does it seem like this is happening in other jurisdictions as well, or is Massachusetts on an island here?

Danessa Watkins:

So Massachusetts is the first to issue a ruling that is this sweeping, and I think certainly it will have implications for other courts going forward. There is, of course, a strong basis for maybe some of the more conservative jurisdictions to not push that needle, and that really comes down to the separation of powers. Because here we have a legislature that has said the sentencing guideline is mandatory life in prison without parole for homicide once you hit 18, and we have a judiciary that is saying, no, we're gonna take a step back. We're gonna look at the scientific research. And of course they were looking at other considerations as well. But they said, no.

Danessa Watkins:

That's unconstitutional.

Jack Sanker:

That's interesting, that the constitutionality of the sentencing guidelines is, it seems, in large part being based on scientific evidence. Like, you don't typically see courts latching on to expert testimony as a basis for finding a statute unconstitutional.

Danessa Watkins:

Yes. Absolutely. And I think that was a big part of the dissent. Now, I should take a step back for a minute and say that this was a 4-to-3 decision before the Supreme Court, and in Massachusetts, the Commonwealth, the justices are very rarely split like that. So certainly the dissent had a lot to say about the court overstepping.

Danessa Watkins:

In particular, Justices Lowy and Cypher, in their dissents, emphasized that great deference is owed to the legislature's right to define the punishment for criminal behavior and to define the line between juvenile and adult offenders. So they contend that the question of whether a punishment is constitutionally disproportionate considers whether the punishment is cruel or unusual in light of contemporary standards of decency. And then, quoting from Gregg versus Georgia, which is a US Supreme Court decision from 1976, these justices said legislatures, not courts, are constituted to respond to the will and consequently the moral values of the people. So certainly there was pushback within the court itself. There have definitely been a lot of articles that have come out since.

Danessa Watkins:

I'm sure there are some conservative jurisdictions that are concerned about what this means and what the implications could be, you know, across the board, not just with sentencing guidelines. Another effect of this ruling is that it will instantaneously provide probably more than 100 people that are currently serving life sentences with eligibility for a parole hearing. So certainly the courts are gonna be flooded now with those motions, and I would imagine that that could also dissuade other jurisdictions from, you know, revising their sentencing guidelines along those lines.

Jack Sanker:

Up next, a law firm in New York filed a fee petition for roughly $113,000 in attorney's fees and costs for the representation of a disabled child under the Individuals with Disabilities Education Act, or IDEA. And we're not gonna talk about the underlying lawsuit. We're gonna focus on this fee petition. And the firm that filed the fee petition is a firm called the Cuddy Law Firm. They filed their petition under the IDEA.

Jack Sanker:

There's an attorney fee recovery provision for the prevailing party. If any of you are familiar with prevailing party language, it's pretty standard statutory prevailing party language. And the fee petition itself, on its face, seemed pretty standard. Usually, for those of you that know or have some experience with this, you track your time, submit an hourly rate based on a number of factors, including things like the skill involved, the novelty of the matter that you were working on, the reputation of the attorneys that are working on it, the complexity of the case, things like that. That's what you justify your hourly rate by, and then you just do the math and multiply that rate by the number of hours billed.
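The fee math Jack describes, a justified hourly rate multiplied by hours billed for each timekeeper, can be sketched roughly as follows. All names, rates, and hours here are hypothetical illustrations, not figures from the actual Cuddy Law Firm petition:

```python
# Hypothetical lodestar-style fee calculation, as described above:
# each timekeeper's claimed hourly rate (justified separately by skill,
# novelty, reputation, and complexity) is multiplied by hours billed,
# and the products are summed into the total fee request.
# All names and numbers below are made up for illustration.

timekeepers = [
    {"name": "Senior attorney", "rate": 550.0, "hours": 120.0},
    {"name": "Associate",       "rate": 325.0, "hours": 80.0},
    {"name": "Paralegal",       "rate": 150.0, "hours": 40.0},
]

def lodestar(entries):
    """Sum rate * hours across all timekeepers."""
    return sum(t["rate"] * t["hours"] for t in entries)

total = lodestar(timekeepers)
print(f"Requested fees: ${total:,.2f}")  # prints "Requested fees: $98,000.00"
```

A court that finds the claimed rates excessive, as happened here, effectively substitutes lower rates into the same multiplication before re-summing.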

Jack Sanker:

In this case, the fee petition was based on a number of things. But one of the most interesting segments, and the part that we're gonna highlight here, is where the lawyers said in their filing that they used ChatGPT to cross-reference different fee databases as support for the proposed hourly rates. And this is what the district court had to say about that. Quote: it suffices to say that the Cuddy Law Firm's invocation of ChatGPT as support for its aggressive fee bid is utterly and unusually unpersuasive. As the firm should have appreciated, treating ChatGPT's conclusions as a useful gauge of the reasonable billing rate for the work of a lawyer with a particular background carrying out a bespoke assignment for a client in a niche practice area was misbegotten at the jump.

Jack Sanker:

And later they say: in claiming that ChatGPT supports the fee award it urges, the Cuddy Law Firm does not identify the inputs on which ChatGPT relied. It does not reveal whether ChatGPT anywhere considered a very real and very relevant data point, which is the uniform block of precedent, canvassed in the opinion, in which the courts of this district and the circuit have rejected as excessive the billing rates the Cuddy Law Firm urges for its timekeepers. The court therefore rejects out of hand ChatGPT's conclusions as to the appropriate billing rates here. Barring a paradigm shift in the reliability of this tool, the Cuddy Law Firm is well advised to excise references to ChatGPT from future fee applications. So, Danessa, it seems like this is a pretty strong signal from the federal district court here that ChatGPT has, like, no place being cited as a reference.

Jack Sanker:

Any thoughts on that?

Danessa Watkins:

You know, I'm just getting up to speed with a lot of this ChatGPT stuff and how it's affecting our legal world, but I actually thought that was kind of ingenious, I mean, coming from someone who has had to prepare these fee petitions before. That's

Jack Sanker:

right. In your line of work, you're probably always doing that. Right?

Danessa Watkins:

Yeah. Certainly. It comes up a lot, and trying to educate a judge on the going rate is hard, particularly in the defamation world. You know, there aren't that many cases that go that far in defamation. A lot of them are either dismissed off the bat or settle.

Danessa Watkins:

So I find that it's not like your basic, you know, contract dispute or a personal injury dispute. I mean, these defamation cases, in my opinion, involve a specialized area of law. So trying to explain to a judge why, you know, my rate may be different than the personal injury lawyer's is typically hard to do. So I think a resource like this could help. So maybe I disagree with the judge, is my point.

Danessa Watkins:

But I also certainly won't pretend to have the expertise in this area to say one way or another.

Jack Sanker:

Well, what's interesting is, and we covered this in episode 43, an instance, I believe also in New York, where an attorney was using ChatGPT to draft portions of a filing for the firm. And this one made a lot of waves in the news because ChatGPT was just hallucinating entire cases out of thin air and really making things up. And so with a language model being effectively an advanced autocomplete, which is kind of how ChatGPT operates, asking it to, you know, create something from scratch seems like it's not an effective way to use it. However, I tend to agree that maybe they should have explained better what they were using as inputs: using ChatGPT to make an apples-to-apples comparison of what is the same and different between 2 sets of data, so long as you specify those datasets. That actually seems like a great use for it.

Jack Sanker:

And I wonder, at least in the legal world, every time there's news about ChatGPT or this type of technology intersecting with the law, it's almost always hugely negative. And I wonder if now courts are seeing it as, you know, just a red flag in and of itself. Like, oh, you used ChatGPT. This must be worthless. So I'm not going to use it, on the basis that this is an inherently unreliable product.

Jack Sanker:

And I think the opinion in this case alludes to it as well. It says, as the firm should have appreciated, treating ChatGPT's conclusions as a useful gauge was misbegotten at the jump. So it's starting from the assumption that ChatGPT is completely unreliable, without really engaging with the way in which it could have been used properly in this case. I don't know whether it was or wasn't. I didn't read the fee petition itself, so I'll take the court at its word that there is no specification of what datasets ChatGPT was relying on.

Jack Sanker:

However, if those datasets were specified, and they are themselves reliable, then I think running them through this thing and having it tell us how these 2, or 2 or more, things compare makes total sense to automate. And, you know, otherwise, you would just be paying someone to do that with an Excel spreadsheet. So I don't see how that would be any different. So I wonder if this is, like, a sign of the courts maybe being a little too harsh on the use of these products here because of the negative reputation it's gotten in the news. I don't know.

Jack Sanker:

And I think if you're going to be filing anything and you're mentioning this technology, the best practice here is to include an explanation, to the extent that you can, of how this thing works and what its output is being based upon. Because, I mean, let's face it: your goal is to be educating the federal judiciary, in this case, who are not young people and who, you know, are gonna come to their conclusions about a new technology one way or the other, and that's gonna be their conclusion about that technology for the rest of their careers. So you can't just put it out there and say, hey, we use this technology and that's a good thing.

Jack Sanker:

Like, not everyone knows. You need to explain to them why it works.

Danessa Watkins:

No, I agree. So I go to an annual ABA conference on First Amendment law, and I just went this past month, and AI was a huge topic. And, you know, you heard all different perspectives on it, but one thing that really stuck with me is that anything like this, any AI, is only as good as the data you put into it. So

Jack Sanker:

Yeah. Garbage in, garbage out. Right?

Danessa Watkins:

Yeah. Exactly. So I guess if we lawyers are going to be using this technology, we need to help educate the judiciary on what it is, you know, how it's used. Obviously, everything is not created the same, and just like we go through these continuing legal education courses and whatnot, I think, now that it's become so pervasive, the judges also probably should be doing some of their own training on what's out there for this technology, how it can be used, how it can be misused. But we can assist with that as lawyers as well.

Jack Sanker:

Yeah. And, I mean, if anyone has been listening to the show long enough, I am pretty consistently bearish on AI as applied to the law. I don't think it's a massive paradigm shift in and of itself. I think it'll change how things are done at the margins, like anything else does. But I don't think that it's, you know, gonna wipe out all the attorney jobs all of a sudden, or

Danessa Watkins:

Well, I hope not. Right.

Jack Sanker:

But, and I think I've made this joke on the show before, everyone who was, like, super into crypto and Bitcoin up until about 2021, those same people are now all about AI. So I think that tells you, you know, kind of the nature of the trendy tech that we have going on. But if you're going to stick your neck out there and say to the court, I'm relying in part on this technology to come to, you know, this conclusion about what's a fair and reasonable rate for my fees, you do owe it to the court, and to yourself, because you're the one who's asking for those fees, to explain how the technology works. And from what I gather, they didn't do a very good job of that, and the court seems to have already been predisposed to a negative understanding of how this works. And that's how the ruling went.

Jack Sanker:

By the way, the fee petition was for $113,000. It was reduced to under $60,000. So that would hurt.

Danessa Watkins:

Absolutely. Alright. So, on to these efforts that states are undertaking to try to pass legislation that will protect children from the use of their data. We have a slew of litigation that has been filed across the country against these platforms: Meta, which is Facebook and Instagram, Snapchat, really a lot of the social media sites that are most attractive to children or adolescents. There's been a real pushback on these companies not doing enough to protect the children who they know are users of these programs.

Danessa Watkins:

So, most recently, in December of 2023, there was a lawsuit brought by the Attorney General of New Mexico, and he was suing Meta for its lack of age verification. So they're allowing teens and preteens both to register for these accounts. And when they do, there are algorithms already in place that will direct sexually explicit content to those users. The complaint was 228 pages, so I will admit that I did not read the whole thing. But this seems to be what's kind of happening across the country.

Danessa Watkins:

We've got suits from attorneys general. We've got even derivative actions coming from the shareholders of these companies, claiming that these executives are opening themselves and the company up to liability on both a civil and a criminal level.

Jack Sanker:

The shareholders are bringing... oh, that's interesting.

Danessa Watkins:

Yeah.

Jack Sanker:

They're saying you're putting the value of the company at risk by this practice. That's very interesting.

Danessa Watkins:

Yeah. And actually, just recently, Mark Zuckerberg, the CEO of Meta, filed a scathing pleading before the Delaware Chancery Court, saying that everything the shareholders are alleging is just based on, like, the news, essentially, and that they don't actually have the data to say that the company is, you know, putting out this content to teens without any restrictions.

Jack Sanker:

Yeah. I mean, Meta, Facebook Meta, has taken a lot of hits with respect to issues kind of similar to this, some of which we've covered on the show. I know their facial recognition program, which, if you remember, when you posted a picture to Facebook, it would give you suggestions on who to tag or whatever, facial recognition. That was brought down and essentially stopped as a direct result of Illinois Biometric Information Privacy Act litigation and, like, I don't remember offhand, but a couple hundred million dollar judgment, and similar results in places like Texas, if I recall. And yeah.

Jack Sanker:

I mean, one, it's amazing, because, like, they keep getting crushed in these cases for hundreds of millions of dollars. It just shows how much money they have. Like, they're apparently very liquid. But also, I mean, it's open season. Because I think what has happened is, up until, like, yesterday, Meta was doing all of this stuff, which everyone agrees is bad.

Jack Sanker:

And so, you know, going forward from a certain point, when people started paying attention, seems like maybe around 2018 or so, they're more conscious about privacy, content moderation, things like that. But anyone who wants to file a claim going back to when people actually used Facebook, which was before 2018, all those claims are still out there. You know? So they've kinda still got, you know, the exposure from the 15 prior years of business practice, before anyone started caring about this.

Danessa Watkins:

Yeah. Absolutely. And certainly, Congress is getting involved now. We've seen a bunch of congressional hearings. I also read that a former Facebook engineer called Meta's safety features a placebo, claiming that the executives knew that these measures were intended to protect children, but that they have repeatedly failed.

Danessa Watkins:

So, yeah, the government's paying attention. But the story that I actually wanted to talk about was what efforts the states are making to try and pass legislation that will do more to protect children online and make sure that their data isn't being used the same way as adults' data. Particularly when you think about, you know, when you sign on to these free platforms. Let's just take Facebook, because it's an easy example. So you sign up for these free social media sites and you're presented with these terms of service. I mean, even as a lawyer, I'm, like, reading through these and my eyes are going crossed.

Danessa Watkins:

So...

Jack Sanker:

And did you actually read them?

Danessa Watkins:

Well, I think I probably started paying attention a little bit more after I went through law school because, you know, then you become a contract nerd a little bit, but Yeah.

Jack Sanker:

I totally agree

Danessa Watkins:

with you too.

Jack Sanker:

Every word.

Danessa Watkins:

But I mean, think about that and how hard it is for us as adults, and then these are not in language that kids can understand, and yet they're potentially signing away their, you know, rights to the control of their data. And for anyone that doesn't know, I mean, these online providers rely on advertising in order to earn their revenue, and those advertisements are targeted to users based on their interests. So that information comes from the data that is collected by these companies when the users are online, and then the advertisers can, you know, tailor who they're sending content to, and it's on an individual basis. So you have these children who are looking at things without realizing that all of that data is being collected, maybe even from something that took them somewhere they didn't even wanna go, but they were there for a second. So now they're gonna get, you know, advertisements and content based on that.

Jack Sanker:

Yeah. The algorithm, if you will, that, you know, is a positive feedback loop of whatever things you like or are interested in. It just shows you more and more of it.

Danessa Watkins:

Alright. So what California has tried to do is, the legislature passed, and this was unanimous through the legislature, then through the governor, what is California's Age-Appropriate Design Code Act, the CAADCA. And this affords protections to children essentially when they access the Internet, but the act is focused on those for-profit businesses that are collecting consumers' personal information and using that, essentially, for profit. So this act was supposed to become effective as of July 1st of this year, 2024, and it imposed a number of requirements on those businesses that provide an online service, product, or feature that is likely to be accessed by children. So we already have a federal child privacy law in place.

Danessa Watkins:

It's COPPA, the Children's Online Privacy Protection Act. And this limits the ability of online providers to collect personal information from children. However, it is focused on those websites that are directed towards children. So it has a much narrower focus than the act that California was trying to put in place, because California was focused broadly on any online service that is likely to be accessed by children.

Jack Sanker:

It's interesting that it's California, too, because it's not federal regulation, obviously, but California is the hub for these companies, websites, apps, whatever. So they all live and work there, and probably most were founded there. So regulations at the state level are gonna have national ramifications. Right?

Danessa Watkins:

Yes. So what the CAADCA does is it requires these companies to configure all of their default privacy settings to the highest possible level of privacy, and they have to put their privacy policies in terms that children can understand. Now, COPPA, the federal statute, gives parents authority to make decisions about the use of their children's personal information. But in contrast, the CAADCA requires that online providers create a data protection impact assessment, or DPIA, report, which identifies, for each online service that is likely to be accessed by children, any risk of material detriment to children arising from the data management practices. Providers would then need to create a plan to mitigate or eliminate those risks before the service can be accessed by children.

Danessa Watkins:

And under the act, those reports would have to be submitted to the California Attorney General by July 1, 2024. So, obviously, this is a big undertaking for those online providers. And then, specific to the data that's being collected from users, these corporations are also going to be barred from using children's personal information for any reason other than the reason for which the personal information was collected, unless they can demonstrate a compelling reason why use of that data is in the best interests of children.

Jack Sanker:

So that would basically nuke the ability to advertise based on data collected from children.

Danessa Watkins:

Exactly. Yes. So, obviously, a lot of these big companies took issue with this once it was passed, specifically NetChoice, LLC, which is a national trade association of online businesses that share the goal of promoting free speech and free enterprise on the Internet. Some of its members are Google, Amazon, Meta, and TikTok. They're the ones that brought this lawsuit before the district court in California, and they were seeking an injunction, arguing that this new law, the CAADCA, violated a bunch of their constitutional rights.

Danessa Watkins:

The one that I'm focusing on here is the First Amendment. Alright. So NetChoice's main argument was that this law essentially requires them to narrowly tailor their speech prior to putting out their creative efforts, in a way that maybe is not even required, because they don't know who is gonna be accessing it, and they're forced to take these steps to limit their content before they even know who the end user is.

Jack Sanker:

So it's a prior restraint on speech, is what they're saying?

Danessa Watkins:

Correct. Yes. A prior restraint. Now, they made a lot of other arguments. However, the district court judge did glom on to that First Amendment argument, saying that what this act effectively does is restrict the availability and use of information by some speakers but not others, and for some purposes but not others.

Danessa Watkins:

So it is a regulation of protected expression under the First Amendment. And undertaking tasks like assessing how a business model may harm children, and providing certain information to users, also involves the regulation of that covered speech. For example, what is considered detrimental to the health and well-being of a 10-year-old will be different than what's potentially harmful to a 16-year-old. So even with this issue of advertising, the court found that treating an ad as harmful to all users under the age of 18 likely burdens substantially more speech than is necessary to further the government's legitimate interests. Remember, advertisements are considered speech under the First Amendment.

Danessa Watkins:

So the nuances of the First Amendment discussion in this opinion are really interesting. The judge was forced to balance the rights of commercial speech against the protection of children online. So I definitely encourage any of my fellow free speech nerds to read the full opinion, which is NetChoice, LLC versus Bonta, B-o-n-t-a. And that's out of the Northern District of California. So, just to provide an example as to why certain legislators and governments are viewing this as a serious issue:

Danessa Watkins:

There was a study put out by the Harvard T.H. Chan School of Public Health, and it found that social media companies made nearly $11,000,000,000 in advertising revenue from US-based minors in 2022 alone. So, ultimately, a very complex issue. Certainly, we know it's difficult to write laws that will pass constitutional muster. This was one state's first attempt to regulate what they see as a problem and try to address it. It is up on appeal right now.

Danessa Watkins:

The district court did grant the injunction to NetChoice and found that the statute as written was unconstitutional. What I think we're gonna learn from this is that other states are gonna view this as, you know, sort of a checklist of what to do, how to write their laws in such a way that they are narrowly tailored enough to eventually pass constitutional muster. I certainly don't think this is the last that we see of a state trying to regulate these companies' ability to, you know, profit off of minors using their services. And as this case develops, we'll certainly talk about it more on the show and keep you updated.

Jack Sanker:

That's the show everyone. Thank you so much for listening. Danessa, we're so happy to have you on the show going forward. It's gonna be a lot of fun. We have a lot of really interesting topics planned for the next couple of weeks.

Jack Sanker:

As always, you can find us on Apple Podcasts, Spotify, YouTube, wherever you get your shows.

Jack Sanker:

If any of the topics you hear about on the show are interesting to you or affect you, or you have any questions for us, go ahead and put something in the comments on any one of those platforms, and we'll talk to you in 2 weeks.
