🤖 DeepSeek AI in the US Justice System ⚖️
- What Could Possibly Go Wrong? 🤔
Just when you thought American justice couldn't get any worse - what with its backlog of cases, questionable sentencing, and the general vibe of Zootopia's city-wide discrimination problem - someone decided to throw AI into the mix.
Of course, what the system really needed wasn't reform, accountability, or, I don't know, maybe even actual fairness - no, it was robots. 🤣
⚖️ Welcome DeepSeek AI: The Robotic Judge
Enter DeepSeek AI, the latest piece of "we swear this will make justice fairer" technology.
It's a system designed to analyse legal data, predict sentencing outcomes, and help judges make more efficient decisions.
Because obviously the real problem with the justice system was its slowness, right? 🤡 But hang on a minute, it gets better - DeepSeek AI was developed in China.
A government that runs AI surveillance and social credit scores - now training an algorithm to influence US court rulings..?
Just to be clear -
➡️ Traditional Credit Score: based on financial history (loans, credit card payments, debts).
➡️ Social Credit Score (China's system): includes both financial history and personal behaviour (e.g. legal records, social actions, online activity).
This is like trusting Jafar with the Genie's lamp and expecting things to go well.
❌ DeepSeek AI Just Got Banned in the US - For Good Reason
Surprisingly, the US government did something sensible (for once) and shut the whole thing down before it could Aladdin-level backfire.
🚫 Why the Ban?
1️⃣ Data security risks - Who controls the data? Nobody really knew. And when the legal system is involved, that's just a teeny bit of a problem.
2️⃣ Foreign influence - Imagine an AI trained in a completely different legal and ethical system influencing sentencing. It's giving Scar-taking-over-Pride-Rock vibes.
3️⃣ Shady track record - DeepSeek AI had already been caught mishandling user data. Because what's a justice system without a little casual privacy invasion?
So, to recap: The US government almost let a foreign-developed AI decide sentencing outcomes for real people.
🤔 Because nothing screams "trustworthy justice system" like outsourcing punishment decisions to an algorithm trained in a surveillance state.
⚠️ The Algorithm Will See You Now
Now, picture this:
You're in court over something minor - say, forgetting to pay a parking fine. Instead of a judge reviewing your case, an AI spits out a decision based purely on past data.
"Based on historical sentencing patterns, you should receive a $500 fine and two months in prison. Have a nice day!"
The Problem? AI Learns From a Broken System
🗑️ Rubbish in ➡️ rubbish out - If the system has historically been harsh on minorities or low-income defendants, AI will just double down on that bias.
🚫 No human oversight - A judge can (at least in theory) explain their reasoning. AI? Not a chance.
🌫️ Opaque decision-making - If an algorithm makes a mistake (or, worse, starts systematically dishing out life sentences), who do you appeal to? Siri?
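The rubbish-in, rubbish-out point is easy to see in code. Here's a minimal Python sketch with entirely made-up data: a naive "model" that just learns the most common historical outcome for each group and offence will faithfully reproduce whatever bias the records contain.

```python
# Illustrative sketch with entirely hypothetical data: a "model" that
# learns the most common historical sentence per (group, offence) pair
# reproduces whatever bias the historical records contain.
from collections import Counter, defaultdict

# Hypothetical sentencing records: (group, offence, sentence handed down).
# Group "B" was historically punished more harshly for the same offence.
history = [
    ("A", "parking_fine", "fine"),
    ("A", "parking_fine", "fine"),
    ("A", "parking_fine", "fine"),
    ("B", "parking_fine", "fine"),
    ("B", "parking_fine", "prison"),
    ("B", "parking_fine", "prison"),
]

# "Training": count outcomes for each (group, offence) pair.
outcomes = defaultdict(Counter)
for group, offence, sentence in history:
    outcomes[(group, offence)][sentence] += 1

def predict(group, offence):
    """Return the most common historical sentence for this pair."""
    return outcomes[(group, offence)].most_common(1)[0][0]

# Identical offence, different predicted sentence: the bias is now policy.
print(predict("A", "parking_fine"))  # fine
print(predict("B", "parking_fine"))  # prison
```

No real system is quite this crude, but more sophisticated models trained on the same skewed data make the same mistake - just with more decimal places.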
And this isn't just a hypothetical nightmare. It's already happened.
🚨 Remember COMPAS? The Racist AI That Already Screwed Up Sentencing?
For those of you who don't know, back in 2016 the US legal system did try AI-assisted sentencing, and - shocker - it was hugely racist. 💀
Yes, AI in courtrooms existed nearly a decade ago... I know, right? A tool called COMPAS was used to predict whether people were likely to reoffend. It systematically flagged Black defendants as high risk more often than white defendants - even when their criminal records were identical.
And yet, courts actually used it to decide bail and sentencing.
Now, imagine that same flawed system - except with even less oversight, and developed by a country famous for AI-powered mass surveillance. 😬
🚨 Congrats, America! You just invited Syndrome from The Incredibles to automate your legal system. 🚨
🔓 Oh, and DeepSeek AI Has Already Had a Massive Data Breach
Because nothing says "reliable" like a sentencing AI that accidentally leaks private case details.
Reports suggest DeepSeek AI was harvesting user data and could even have links to government surveillance programs.
Now, picture this:
You've been sentenced to 10 years. Also, the CCP now has your search history. Have a nice day! Byeee 👋
🏛️ The US Government Almost Made This Worse
Because, of course, they did.
Before the ban, there were serious discussions about expanding AI's role in sentencing. The idea? More "consistent" punishments.
Because, obviously, the biggest issue in the US legal system isn't mass incarceration, or the fact that people are in prison for non-violent offences - it's that their punishments weren't handed out consistently enough. 🙃
🧠 Why This Is a Catastrophically Stupid Idea
❌ No transparency - Who audits these AI models? If you get an unfair sentence, who do you appeal to - ChatGPT?
❌ Automated discrimination - AI doesn't eliminate bias; it just applies it with mathematical precision.
❌ Who thought robots should hand out prison time?! At what point did someone decide that a computer should determine whether someone gets five years or fifteen?
And let's be clear: this isn't some futuristic debate. AI is already making legal decisions. And it's already screwing people over.
🔮 AI in the Legal System: Tool or Tyrant?
To be fair, AI could be useful in courtrooms - if used responsibly.
✅ It could identify biased sentencing patterns and expose unjust laws.
✅ It could streamline legal research, making justice more accessible.
✅ It could assist judges - but it should never replace them.
But in the hands of cost-cutting bureaucrats and tech evangelists? It's a dystopian disaster waiting to happen.
🤖🚓 This Is How You Get RoboCop Courts
I love a good sci-fi dystopia as much as the next person, but handing legal decisions to an AI with shady data practices? That's literally Skynet running the courts.
Until we have full transparency, oversight, and ironclad data protections, this isn't justice reform - it's institutional laziness disguised as progress.
If the US actually cared about fair sentencing, maybe - just maybe - it should focus on reducing mass incarceration instead of outsourcing prison-time calculations to a computer.
But hey, what do I know? I'm just a human. Not an algorithm.
💬 So, should AI have a role in sentencing? Drop your thoughts below or message me with your rants - before an algorithm decides whether you're allowed to comment. ⬇️