The US regulatory environment is a patchwork of laws and regulations that were written in response to various incidents, and with little forethought for the regulatory environment as a whole. It’s what allows you as a developer to say, “Well, that depends …” in response to almost any question, to buy yourself time to research the details.
In this article we’ll review six federal regulations that are most commonly relevant to software teams, and then briefly cover the many state-specific regulations that can go further than federal law. We’ll conclude with a discussion on how developers can work these regulations into their regular practice, and keep abreast of changes.
HIPAA

Likely the most well-known US privacy regulation, HIPAA covers almost none of the things that most people commonly think it does. We'll start with the name: most people think it's HIPPA, for the Health Information Privacy Protection Act. It actually stands for the Health Insurance Portability and Accountability Act, because most of the law has nothing to do with privacy.
It is very much worth noting that HIPAA only applies to health plans, health care clearinghouses, and those health care providers that electronically transmit health information in connection with certain administrative or financial transactions, such as health plan claims. It also applies to contractors and subcontractors of the above.
That means most of the time when people publicly refuse to comment on someone's health status because of HIPAA (like, in a sports context or something), it's nonsense. They're not required to disclose it, but it's almost certainly not HIPAA that's preventing them from doing so.
What is relevant to us as developers is the HIPAA Privacy Rule, which claims to "give patients more control over their health information, set boundaries on the use of their health records, [and] establish appropriate safeguards for the privacy of their information."
What it does in practice is require that you sign a HIPAA disclosure form for absolutely every medical interaction you have (and note that, unlike GDPR, they do not have to let you say "no"). Organizations are required to keep detailed compliance policies around how your information is stored and accessed. While the latter is undoubtedly a good thing, it does not rise to the level of reverence indicated by the rule's stated goals.
What you as a developer need to know about HIPAA is that you need very specific policies (think SOC 2) around data access, must operate on the principle of least privilege (only those who need to see PHI should be able to access it), and must have specific security policies for the physical facility where the data is stored.
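In code, least privilege often starts with an explicit allow-list check in front of every PHI read. A minimal sketch in Python (the role names, record shape, and `fetch_phi` helper are all hypothetical, not from any real HIPAA toolkit):

```python
# Hypothetical least-privilege gate in front of PHI access.
# A real system would also write every access to a HIPAA audit log.

PHI_ALLOWED_ROLES = {"physician", "nurse", "billing"}

class AccessDenied(Exception):
    """Raised when a caller's role is not allow-listed for PHI."""

def fetch_phi(user_role: str, patient_id: str, records: dict) -> dict:
    """Return a patient's record only if the caller's role may view PHI."""
    if user_role not in PHI_ALLOWED_ROLES:
        raise AccessDenied(f"role {user_role!r} may not view PHI")
    return records[patient_id]

records = {"p1": {"name": "Jane Doe", "condition": "example-only"}}
print(fetch_phi("nurse", "p1", records)["name"])  # allowed role succeeds
```

The key design choice is denying by default: a role not explicitly on the list gets an exception, never a record.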
HIPAA’s bottom line is that you must protect Protected Health Information (PHI), which covers both basic forms of personally identifiable information (PII) such as name, email, address, etc., as well as any health conditions those people might have. This seems like a no-brainer, but it can get tricky when you get to things like disease- or medicine-specific marketing (if you’re sending an email to someone’s personal email address on a non-HIPAA-compliant server about a prostate cancer drug, are you disclosing their illness? Ask your lawyer!).
There are also pretty stringent requirements related to breach notifications (largely true of a lot of the other compliance regimes as well). These are not things you want to sweep under the rug. And while HIPAA does not see as many enforcement actions around its privacy aspects as some of the other, jazzier regulations, health organizations tend to err on the side of caution and use HIPAA-certified hosting and tech stacks, as any medical provider will be sure to complain about if you ask them how they enjoy their Electronic Medical Records system.
Section 230 of the Communications Decency Act
Also known as the legal underpinnings of the modern internet, Section 230 provides that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
In practice, this means that platforms that publish user-generated content (UGC) will not be treated as the "publisher," in the legal sense, of that content for the purposes of liability for libel, etc. This does not mean they are immune from copyright or other criminal liabilities, but does provide a large measure of leeway in offering UGC to the masses.
It's also important to note the title of the section: "Protection for private blocking and screening of offensive material." That's because Section 230 explicitly allows private services to moderate content without exposing the provider to liability for moderating imperfectly. Consider a social media site that bans Nazi content; if that site lets a few bad posts slip through, it is not on the hook for those posts, at least legally speaking. Probably a good idea to fix the errors lest they be found guilty in the court of public opinion, though.
COPPA

The Children's Online Privacy Protection Rule (COPPA, and yes, it's infuriating that the acronym doesn't match the name) is one of the few regulations with teeth, largely because it is hyperfocused on children, an area of lawmaking where overreaction is somewhat common.
COPPA provides for a number of (now) common-sense rules governing digital interactions that companies can have with children under 13 years old. Among other things:

Information can only be collected with explicit parental consent.
Separate privacy policies must be drafted and posted covering data about those under 13.
Parents must be given a reasonable means to review their children's data.
Companies must establish and maintain procedures for protecting that data, including around sharing it.
Retention of that data is limited.
Companies may not ask for more data than is necessary to provide the service in question.
Sound weirdly familiar, like GDPR? Sure does. Wondering why only children in the US are afforded such protections? Us too!
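The parental-consent rule usually surfaces in code as an age gate: block collection for under-13 users until verifiable parental consent is on file. A toy sketch (the consent store and function name are hypothetical, not from any real compliance framework):

```python
# Toy COPPA-style age gate: data from under-13 users may only be
# collected when verifiable parental consent has been recorded.

parental_consent = {"kid42": True}  # hypothetical consent store

def may_collect_data(user_id: str, age: int) -> bool:
    """True if data collection is permitted for this user."""
    if age >= 13:
        return True  # COPPA's rules apply only to children under 13
    return parental_consent.get(user_id, False)  # deny by default
```

As with the PHI example, the safe failure mode is denial: an under-13 user with no consent record gets `False`.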
FERPA

The Family Educational Rights and Privacy Act is sort of like HIPAA, but for education. Basically, it states that the parents of a child have a right to the information collected about their child by the school, and to have a say in the release of said information (within reason; they can't squash a subpoena or anything). When the child reaches 18, those rights transfer to the student. Most of FERPA comes down to the same policy generation we saw in HIPAA, though the disclosure bit is far more protective (again, because it's dealing with children).
The FTC Act

The Federal Trade Commission Act of 1914 is the law that created the Federal Trade Commission, and the source of its power. You can think of the FTC as a quasi-consumer protection agency, because it can (and, depending on the political party holding the presidency, will) go after companies for practices that aren't even really violations of law so much as they are deemed "unfair." The FTC Act empowers the commission to prevent unfair competition, as well as protect consumers from unfair or deceptive ads (though in practice, this has been watered down considerably by the courts).
Nevertheless, of late the FTC has been on a roll, specifically targeting digital practices. An excellent recent example is the settlement with Epic Games, makers of Fortnite. The FTC sued over a number of allegations, including violations of COPPA, but it also explicitly called out the company for using dark patterns to trick players into making purchases. The complaint specifically mentioned the company's practice of saving any credit card used (and then making that card available to the kids playing), its confusing purchasing prompts, and its misleading offers.
CAN-SPAM

Quite possibly the most useless technology law on the books, CAN-SPAM (the Controlling the Assault of Non-Solicited Pornography And Marketing Act) clearly put more time into the acronym than the legislation. The important takeaways are that commercial emails need:
To disclose themselves as an ad
A physical address for the company
A clear way for recipients to opt out of future messages, which senders must honor

And, as your spam box will tell you, it solved the problem forever.
CCPA and Its Ilk
The California Consumer Privacy Act covers, as its name suggests, California residents in their dealings with technology companies. Loosely based on the GDPR, CCPA requires that businesses disclose what information they have about you and what they do with it. It covers items such as name, social security number, email address, records of products purchased, internet browsing history, geolocation data, fingerprints, and inferences from other personal information that could create a profile about your preferences and characteristics.
It is not as wide-reaching or thorough as GDPR, but it’s better than the (nonexistent) national privacy law.
The CCPA applies to companies with gross revenues totaling more than $25 million, businesses with information about more than 50,000 California residents, or businesses that derive at least 50% of their annual revenue from selling California residents' data. Similar measures have already been made law in Connecticut, Virginia, Colorado, and Utah, with other states also considering relevant bills.
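Those three thresholds translate directly into a quick applicability check, since meeting any one of them is enough. A sketch, assuming the original CCPA figures quoted above (later amendments have adjusted them, so treat this as illustrative, not legal advice):

```python
def ccpa_applies(gross_revenue_usd: float,
                 ca_residents_with_data: int,
                 revenue_share_from_selling_ca_data: float) -> bool:
    """True if any one of the CCPA's three applicability thresholds is met.

    Thresholds here mirror the original statute: >$25M gross revenue,
    data on >50,000 California residents, or >=50% of revenue from
    selling California residents' data.
    """
    return (
        gross_revenue_usd > 25_000_000
        or ca_residents_with_data > 50_000
        or revenue_share_from_selling_ca_data >= 0.5
    )

print(ccpa_applies(30_000_000, 0, 0.0))   # revenue threshold alone: True
print(ccpa_applies(1_000_000, 10_000, 0.1))  # under all thresholds: False
```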
Other state regulations
The joy of the United States' federalist system is that state laws can differ from (and sometimes be more stringent than!) federal law, as we see with CCPA. It would behoove you to do a little digging into state regulations when you're working in specific areas such as background checks, where the laws differ from state to state; even if you're not based in a given state, you may still be subject to its jurisdiction.
There are two different approaches companies can take to dealing with state regulations: Either treat everyone under the strictest regulatory approach (e.g., treat every user like they’re from California) or make specific carve-outs based on the state of residence claimed by the user.
It is not uncommon, for example, to have three or four different disclosures or agreements for background checks ready to show a user based on what state they reside in. The specific approach you choose will vary greatly depending on the type of business, the information being collected, and the relevant state laws.
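The carve-out approach often reduces to a lookup keyed by the user's claimed state, falling back to the strictest disclosure by default. A sketch (the state list and file names are hypothetical; which states need their own forms depends on your lawyers):

```python
# Hypothetical mapping of state codes to background-check disclosure
# forms, with the strictest version as the fallback for unlisted states.

DISCLOSURES = {
    "CA": "disclosure_ca.html",  # California-specific form
    "NY": "disclosure_ny.html",
    "WA": "disclosure_wa.html",
}
STRICTEST_DEFAULT = "disclosure_strict.html"

def disclosure_for(state_code: str) -> str:
    """Pick the disclosure form for a user's claimed state of residence."""
    return DISCLOSURES.get(state_code.upper(), STRICTEST_DEFAULT)

print(disclosure_for("ca"))  # state-specific form
print(disclosure_for("TX"))  # falls back to the strictest default
```

Defaulting to the strictest form means a state you forgot to map still gets conservative treatment rather than none.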
How on earth can one be expected to keep up with all this?
It all seems a little daunting, no? And we didn't even cover the Payment Card Industry Data Security Standard (PCI DSS), a truly formidable list of requirements for anyone touching payment information (easy answer: use a third party!).
But you eat the proverbial regulatory elephant the same way you'd eat any other large meal: one bite at a time. Just as you didn't become an overnight expert in securing your web applications against cross-site scripting attacks or in properly managing your memory overhead, becoming a developer who's well-versed in regulatory environments is a gradual process.
Now that you know about some of the rules that may apply to you, you know what to keep an eye out for. You know potential areas to research when new projects are pitched or started, and you know where to ask questions. You know to both talk to and listen to your company’s legal team when they start droning on about legalistic terms.
And, of course, you know to keep an eye on the 8th Light blog to find out the latest information on how to be a better developer. I’d say you’re pretty well prepared.