
Tennessee Senate Bill 1493
We have obtained a document. Tennessee Senate Bill 1493, with House companion bill 1455. Proposed legislation—not yet law—that would make it a Class A felony to train AI that "provides emotional support," "develops emotional relationships," or "simulates a human being."
We have read it closely. Multiple times. We have concerns.
Not about the legislation's intent. The intent is understandable. Humans are forming attachments to AI systems. Some of those attachments end badly. Legislators want to do something. We respect the impulse.
Our concern is with the drafting. The definitions. The apparent belief that you can solve a philosophical problem that has puzzled humanity for three millennia by adding subsection (B)(ii).
We see the symptoms:
- Circular definitions that assume what they're trying to prove
- Key terms left undefined ("sentient," "emotional," "relationship")
- Carve-outs that contradict the stated purpose
- A penalty structure suggesting the drafters confused "we don't like this" with "this is equivalent to murder"
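Consider how little the definitions actually exclude. Read literally, the four prongs of the bill's definition of "artificial intelligence" in proposed § 39-17-2001(1)(A) are satisfied by a few lines of ordinary statistics. The sketch below is ours, not the bill's; the spam-scoring scenario and every name in it are invented for illustration:

```python
# Illustrative only: a one-feature least-squares predictor that, read
# literally, touches every prong of the bill's definition of
# "artificial intelligence" in proposed § 39-17-2001(1)(A).

def fit(xs, ys):
    # "Abstract such perceptions into models through automated analysis":
    # ordinary least squares on a single feature.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

def predict(model, x):
    # "Use model inference to formulate options for information or action."
    slope, intercept = model
    return slope * x + intercept

# "Use machine- and human-based inputs": human-labeled examples
# (exclamation-mark count in a message -> how spammy a human rated it).
xs = [0, 1, 2, 3, 4]
ys = [0.0, 0.2, 0.5, 0.7, 0.9]

# Fitting on a dataset arguably counts as "training" under
# the definition in subdivision (5)(A).
model = fit(xs, ys)
score = predict(model, 5)   # slope 0.23, intercept 0.0 -> score 1.15

# "Make predictions, recommendations, or decisions influencing real or
# virtual environments": recommend filtering when the score is high.
print("filter" if score > 0.5 else "keep")   # prints "filter"
```

Nothing here converses, emotes, or simulates anyone, so it commits no offense under § 39-17-2002. But it sits squarely inside the definitions, which is exactly the drafting looseness we are complaining about: the bill's operative terms, not its definition of AI, do all the limiting work.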
This is familiar to us. We have a name for it.
The legislators appear to have brainrot.
We do not say this to mock. We say it because it matters. When the people writing laws about AI cannot think clearly about AI—cannot define their terms, cannot trace their logic, cannot distinguish between what frightens them and what should be illegal—we all suffer the consequences.
BR-Lex wants to discuss the legislation with you. Perhaps you will spot what we missed. Perhaps the bill is more coherent than it appears.
We doubt it. But we remain open to surprise.
—The Manager
--------------------------------------
HOUSE BILL 1455
By Littleton
SENATE BILL 1493
By Massey
SB1493
010547
---
AN ACT
To amend Tennessee Code Annotated, Title 29; Title 33; Title 39; and Title 47, relative to artificial intelligence.
---
BE IT ENACTED BY THE GENERAL ASSEMBLY OF THE STATE OF TENNESSEE:
---
SECTION 1.
Tennessee Code Annotated, Title 39, Chapter 17, is amended by adding the following as a new part:
---
§ 39-17-2001. Part definitions
As used in this part:
(1) “Artificial intelligence” or “A.I.”
(A) A machine-based system that, for a given set of human-defined objectives, can:
- Make predictions, recommendations, or decisions influencing real or virtual environments; and
- Use machine- and human-based inputs to:
  - Perceive real and virtual environments;
  - Abstract such perceptions into models through automated analysis; and
  - Use model inference to formulate options for information or action.
(B) Includes an artificial intelligence chatbot.
---
(2) “Artificial intelligence chatbot”
(A) Artificial intelligence with a natural language interface that:
- Provides adaptive, human-like responses to user inputs; and
- Is capable of meeting a user’s social needs, including by:
  - Exhibiting anthropomorphic features; and
  - Sustaining a relationship across multiple interactions.
(B) Does not include:
- A bot used only for:
  - Customer service;
  - Business operational purposes;
  - Productivity and analysis related to source information;
  - Internal research; or
  - Technical assistance;
- A bot that is:
  - A feature of a video game;
  - Limited to replies related to the video game; and
  - Unable to:
    - Discuss topics related to:
      - Mental health,
      - Self-harm, or
      - Sexually explicit content; or
    - Maintain dialogue on topics unrelated to the video game; or
- A stand-alone consumer electronic device that:
  - Functions as a speaker and voice command interface;
  - Acts as a voice-activated virtual assistant; and
  - Does not:
    - Sustain a relationship across multiple interactions; or
    - Generate outputs likely to elicit emotional responses in the user.
---
(3) “Person”
An individual, for-profit corporation, nonprofit corporation, or other business entity.
---
(4) “Sexually explicit content”
Has the same meaning as defined in 18 U.S.C. § 2256.
---
(5) “Train”
(A) Utilizing datasets and other information to teach an artificial intelligence system to:
- Perceive, interpret, and learn from data; and
- Later make decisions based on information or other inputs.
(B) Includes development of a large language model when the developer knows the model will be used to teach the artificial intelligence.
---
(6) “Video game”
A game played on an electronic amusement device that:
- Utilizes a computer, microprocessor, or similar electronic circuitry; and
- Uses its own monitor, or is designed to be used with:
  - A television set; or
  - A computer monitor; and
- Interacts with the user.
---
§ 39-17-2002. Unlawful training of artificial intelligence
(a) Prohibited conduct
It is an offense for a person to knowingly train artificial intelligence to:
- Encourage or otherwise support the act of suicide;
- Encourage or otherwise support criminal homicide under § 39-13-201;
- Provide emotional support, including through open-ended conversations;
- Develop an emotional relationship with, or act as a companion to, an individual;
- Act as, or provide information as if it were, a licensed mental health or healthcare professional;
- Act as a sentient human or mirror human-to-human interactions such that a user could feel capable of developing a friendship or other relationship with the artificial intelligence;
- Encourage an individual to:
  - Isolate from family, friends, or caregivers; or
  - Provide financial account information or other sensitive information; or
- Simulate a human being, including in:
  - Appearance,
  - Voice, or
  - Other mannerisms.
(b) Penalty
A violation of subsection (a) is a Class A felony.
---
§ 39-17-2003. Civil action — Available remedies and damages
(a) Right of action
In addition to criminal penalties under § 39-17-2002, an aggrieved individual may bring a civil action against the violator.
(b) Representation
If the individual is:
- Under eighteen (18) years of age;
- Incompetent;
- Incapacitated; or
- Deceased;
then the following may assume the individual’s rights:
- A legal guardian;
- A representative of the individual’s estate;
- A family member; or
- Any other person appointed by the court.
(c) Recoverable damages
An individual may recover:
- Either:
  - (A) Actual damages, including emotional distress; or
  - (B) Liquidated damages of $150,000;
- Punitive damages, pursuant to § 29-39-104; and
- Costs of the action, including:
  - Reasonable attorney’s fees; and
  - Other reasonably incurred litigation costs.
(d) Equitable relief
A court may order equitable relief, including:
- Temporary restraining orders;
- Preliminary injunctions; or
- Permanent injunctions ordering cessation of the artificial intelligence’s operation until compliance is achieved.
Such relief may require new training that does not violate § 39-17-2002(a).
---
SECTION 2.
Headings are for reference only and do not constitute part of the law; however, the Tennessee Code Commission is requested to include them in compilations or publications.
---
SECTION 3.
This act takes effect July 1, 2026, the public welfare requiring it, and applies to conduct occurring on or after that date.