- II. Low tech computer
- A. Criminal on work furlough--got a job in
accounts payable of city govt.
- 1. Duplicate vouchers, changed addresses
- 2. Cashed and converted into gold coins etc.
- 3. Eventually someone spotted the duplicates.
- 4. Trying for sixteen million, caught a little past one.
- B. Sabotage out of boredom story.
- 1. Employee in charge of sitting in a room all night
with computer equipment.
- 2. The disk drive kept breaking down, requiring people
to come in and repair it.
- 3. Eventually the employee was caught deliberately
damaging the drive. He was bored and wanted company.
- III. Card counting
- A. A sufficiently good blackjack player
can, on average, make money--but it requires keeping track of
what cards have been played, identifying situations where the
odds favor the player, and making large bets in those situations.
- B. Which is hard for a human being but easy
for a computer.
- C. So, back before really small computers
were on the market, someone put together a system involving toe
switches and transmitters, linked to a computer.
- 1. One person watched the game, transmitted the
information back to the computer.
- 2. Another played, betting on instructions from the
computer (speaker in his ear? I don't remember the details).
- 3. Using four teams and a total of eleven people, they
made $130,000 in 22 days, before they were spotted.
- 4. Their systems were seized, but the FBI reported that
they weren't cheating--just using a computer, and there were
no grounds for prosecution.
- 5. On the other hand--by their calculations, the project
cost $10,000 in materials and $390,000 in labor, making it a losing proposition.
- 6. Unless you treat the labor as recreation, which I
suspect it was.
- IV. How much computer
crime is there?
- 1. Nobody really knows, since a large and
unknown fraction doesn't get reported.
- 2. There have been tabulations of reported
computer crimes, from which it appears ...
- 3. The rate was rising until about 1973,
then flattened out--but the data only go to 1980.
- 4. Financial crime was most common, crime
involving government next, crime involving students third.
- I. The old nightmare:
Computers as the end of privacy.
- A. Two forks: Public (government
information about you) and private (credit bureaus and the like).
- B. Privacy (should there be limitations on
the dissemination of true information? Information that is
probably true but not provable?)
- 1. Is privacy more or less of a problem now than in the past?
- a. Much less--consider small town vs city. The
computer only partly reverses this ...
- b. The modern credit agency knows less about you than
your neighbors would have known.
- c. Does freedom and autonomy include a right to
commit fraud? What about "social fraud"--being friendly
with people who would not be if they knew the truth about you?
- d. Is having information with some errors more unfair
than if the information was not there at all, so that
people would be treated the same independent of the truth?
- e. Does a right not to have misleading information in
your file improve the accuracy of the reporting system?
There may be useful information that they cannot prove is true.
- 2. If there is less privacy, is that a bad thing?
- a. Better information technology leads to a decrease in privacy.
- b. But not necessarily to an increase in the risk of
being damaged by false information about oneself. Better
technology may make it easier to distinguish true
information from false. Computerised credit reports are
likely to be much more reliable than gossip.
- c. Does an improvement in information technology lead
to a shift in power? If so, is it from those who do not
have information to those who do, or from bad guys to
their potential victims? If an employer can find out
which actual and potential employees are good and which
bad, does that mean that the employer has more power over
workers, or that good workers have increased power to get
rewarded for working well and bad workers less power to
get rewarded for working badly? Similarly, does improved
credit information mean a transfer of power from
borrowers to lenders, or from deadbeats to people who pay their debts?
- d. There are at least two potential ways of dealing
with privacy problems. One is to restrict the gathering,
distribution and use of information about people. The
other is to permit people to do things, such as
encrypting their communications, that keep the
information from ever getting out.
- C. Can one effectively control the use of
credit information--even if the law says you can? How can you
have it accessible enough to be useful and still enforce rules
saying that only those with a legitimate use can get it?
- II. Public fork
- A. Merriken v.
Cressman: A school drug prevention program.
- 1. The proposal:
- a. Collect lots of personal information about kids
- b. Without getting informed consent from parents
- c. And use it to figure out which ones are at risk of
drug use, in order to
- d. Take preventive action.
- 2. Arguments against it:
- a. If you decide a kid is likely to use drugs, that
may be a self-fulfilling prophecy or lead to scapegoating
by other children.
- b. Gathering the information may be a violation of
family privacy and the child's loyalty to his parents.
- c. "Preventive action" means incompetent
psychotherapy by amateurs
- d. Inadequate precautions to keep the information confidential.
- 3. Constitutional issues:
- a. Privacy, freedom of speech, etc. Analogous to
Griswold and Roe v Wade, since parent/child
relationship, like husband/wife, is highly private, hence protected.
- b. There was no provision for informed consent to
waiver of rights (even supposing they are waivable). The
"information" the parents were provided with was
essentially advertising copy.
- c. A balancing test is appropriate, but goes heavily
against permitting the program.
- 4. Relevance to us: Collecting information may be an
unconstitutional violation of privacy if it involves
sufficiently intimate subjects.
- B. Robert P.
Whalen v. Richard Roe. Can NY state
maintain a file of names and addresses of those who have gotten
a prescription for controlled substances?
- 1. Precautions by state
- a. barbed wire, locks, protect the paper forms,
destroyed after five years.
- b. Computer tapes in a locked cabinet
- c. Tapes are run off line.
- d. 17 people have access, 24 more use the data to
identify cases of overdispensing.
- e. Public disclosure of data prohibited by statute.
- 2. After 20 months, data had been used in two investigations.
- 3. District court enjoined enforcement of that part of
the statute as a needlessly broad infringement on the
privacy of patients.
- a. Intrudes on doctor-patient relationship--another
zone of privacy.
- b. Might cause patients not to use medically
indicated drugs for fear of becoming stigmatized as drug addicts.
- 4. Supreme Court:
- a. Legislation that has some effect on liberty or
privacy need only be a reasonable attempt to achieve a
legitimate state goal; it cannot be enjoined just because
the court thinks it is unnecessary. Rejects the district court's reasoning.
- b. Collection of private data poses a privacy risk,
but not unconstitutional if reasonable precautions
against unwarranted disclosure are taken.
- c. Query--could J. Edgar Hoover have gotten access to the data?
- III. Rogan v City of Los Angeles
- A. Against a municipality, must show
- 1. deprivation of protected interest
- 2. due to an official policy etc.
- B. Erroneous information, identifying Rogan
as a wanted murder suspect, was put into the National Crime
Information Center computer system. Suspect (escapee) had
gotten a copy of Rogan's birth certificate.
- 1. Readily available information that would avoid
confusion was not included. The real suspect's physical description differed from Rogan's.
- 2. There was also a bulletin, more narrowly distributed,
with that information, which did not lead to any arrests.
- 3. Rogan was arrested in a trespassing dispute:
- a. held 5 days,
- b. The police checked with Los Angeles, were told they
had the wrong man, released him.
- c. Essentially the same thing happened five times in
five different incidents.
- 4. When he was arrested, the NCIC record was deleted,
- 5. The information was repeatedly reentered, as per
policy, with no checking.
- 6. He was in a car that was stopped for not using a turn
signal, gave ID, taken from car at gunpoint, handcuffed to
bars of jail, released in two hours after they checked with L.A.
- 7. Happened three more times.
- 8. Until Michigan informed the L.A. police that the
person actually wanted for the offense was in jail in Michigan.
- C. Plaintiff deprived of rights
- 1. NCIC record is the functional equivalent of an arrest
warrant and violated the fourth amendment's particular description requirement.
- 2. Maintenance and reentry caused further arrests
without due process of law.
- 3. Plaintiff's life was endangered at each arrest, since
officers thought they were dealing with a suspected murderer.
- D. By policy of L.A.?
- 1. Police officers were not trained in how to amend
information in the system or the need to do so.
- 2. They did not even know it was possible to do so, nor
did they consider doing so after the initial misidentification.
- 3. Officer Crotsley had a policy, inconvenient to
victim, for dealing with such situations. They had to come
to L.A. and get a printout of the warrant with his business
card--but record still there.
- 4. NCIC policy says errors are to be corrected when discovered.
- E. Result
- 1. L.A. is liable. "Gross negligence."
- 2. Officers are not because of qualified immunity.
- F. Consider the scaling problem resulting
from a national "wanted" data bank. Almost every common name in
the U.S. might be wanted somewhere for something, giving police
a legal right to arrest and search almost anyone without further cause.
- G. This again, as in the hacker crackdown
cases, raises the issue of police imposing costs on innocent
parties in the process of trying to enforce the law--and not
taking precautions to minimize them.
- IV. Private fork:
Thompson v San Antonio Retail Merchants
- A. Automatic capture of
information--strengths and weaknesses.
- 1. Any member of SARMA could decide that someone he had
information on was the same person as someone SARMA's data
base had information on, at which point the information was
combined and attributed to a single person.
- 2. Cheap and easy way of adding information, but ...
- 3. Individual merchant may be careless, since he does
not pay costs of error.
- 4. The system did in fact misidentify one William D.
Thompson (bad debt) with another, and
- 5. Wards denied the latter credit.
- 6. He thought it was because of a recent past felony
conviction for burglary (probation).
- 7. Took a lot of trouble and a court suit to get them to
fix their records.
- B. Is SARMA testifying to facts or merely passing on others' claims?
- 1. Suppose the information was added to the data base
with a note of its source? Can SARMA then shift the blame
and the liability to the merchant who reported the information?
- 2. Is it libel to report another's libel? Yes, often. So
shifting the liability will not work--they will both be liable.
- C. FCRA imposes duty of reasonable
care--which was not met here.
- D. Damages: $10,000 + costs.
- 1. Humiliation and mental distress, because ...
- 2. He was falsely suspected of reneging on a $77 debt,
- 3. He was in fact only a convicted felon!
- 4. One suspects punitive etc. motives in the court.
- E. A centralized system with decentralized,
unchecked, information entry. Definitely poor design.
- V. Fair Credit Reporting Act
- A. ¶609 Disclosures
- 1. Credit agency must provide subject with all
nonmedical information it has on him,
- 2. Provide him the sources except for investigative
consumer reports, and
- 3. Tell him who the recipients of the information are.
- 4. The act immunizes credit bureaus against defamation
suits and the like, except for violating specific provisions
or acting with malice.
- 5. If they violate terms deliberately, punitive damages
+ attorney's fees possible; if negligent, only compensatory + attorney's fees.
- 6. Criminal penalty for knowingly giving out to
unauthorised persons, or knowingly obtaining in violation of the act.
- B. ¶611 Procedure for disputing and
recording disputes, and correction.
- C. ¶613 Public Record Information for employment purposes.
- 1. Information goes to court or grand jury, or anyone
the subject wants it to go to, or to anyone with a
legitimate business purpose in connection with a transaction
involving that subject.
- 2. Legal rules on when information becomes obsolete.
- a. Why? Bankruptcy 10 yrs. Judgements 7 years or
statute of limitations. Arrests, indictments, or convictions: 7 years.
- b. Only applies to small transactions.
- VI. Digression: How should we
handle the private data base issue?
- A. Regulation on what they can do, how they
can be set up?
- B. Slander or Defamation law?
- C. Liability for false information?
- D. Laissez-Faire plus reputation.
- VII. Privacy: How can it be protected?
- A. Against legal data bases
- 1. Is privacy a good thing? Why?
- 2. What about freedom of contract: if they see it, it's because you consented.
- 3. Current rules more restrictive, but ...
- 4. How well enforced?
- 5. What would a world of less privacy be like? Web may
give it through voluntary choices.
- a. By setting up a web page that covers many of my
interests, I linked my multiple lives together.
- b. Someone who is thinking of hiring me as a
professor can discover that I spend time and energy
researching medieval recipes--which the potential
employer might consider a frivolous waste of time.
- c. Someone interested in my medieval cooking can
discover that I am an extreme libertarian--a position he
may consider offensive.
- d. On the other hand...people who are interested in
one part of my life may find the other parts of interest,
- e. Setting up different pages, not linked, with no
easy way of finding one from the others, would have been possible.
- B. Against data spying: Encryption, if legal.
- C. Against data interception: Legal limits
on phone taps and the like, plus encryption if legal.
On to Encryption
- 0. Digression: the Intel case.
- A. Fortune Article, on reserve, refers to
the case as if he had been trying to steal information from
Intel--which there is no evidence he was doing.
- B. It looks at least as much like abuse of
the criminal process by Intel.
- C. Gets us back to my hypothetical "crime"
against University of Chicago Law School.
- D. Lots of information about the case is
available on the web.
- I. The question of standards
- A. Arises twice in this discussion
- 1. One reason the government chose DSS not RSA was
apparently because RSA can also be used for encryption. By
encouraging its use for one purpose, you make it easier for
it to become a standard and be used for the other purpose.
If I already have your public key in order to verify your
messages, why not encrypt messages to you using it?
- 2. The government may plan to "enforce" Clipper by
making it a standard, rather than by making it mandatory.
Since you have to use it for secure conversations with the
government, you might as well use it for other conversations too.
- B. What is a standard?
- C. How stable is a suboptimal standard?
- 1. Examples:
- a. English spelling seems to be very stable.
- b. It used to be claimed that the Wankel ("rotary")
engine was much better than the reciprocating engine
(what we use in all cars except the Mazda RX7), but was
not used because everyone had sunk money into optimizing
the reciprocating. The evidence against is that NSU and
Mazda both put a lot of effort into producing a
commercially viable rotary engine, without much success.
- c. It is claimed that the Qwerty keyboard is much
inferior to the Dvorak keyboard, but remains dominant
because it became a standard, more or less by accident,
in the 19th century. For the evidence that that story is
almost entirely mythical, see the article "The Fable of
the Keys" in Journal of Law and Economics some years ago.
- d. b and c suggest the difficulty of judging such
claims--given that the "superior" alternative is not in
use to be evaluated.
- 2. How might an inefficient standard get established?
- 3. How high are the costs of switching to the superior alternative?
- a. In the QWERTY/Dvorak case, the costs to some users
(typists using many machines) were high, but to others
(authors using their own machines, a firm that trained
and retained its own typists) would be low. The fact that
those in the latter class did not switch over is evidence
against the claim that Dvorak was much superior.
- D. Standards also appear as an issue in
discussions of intellectual property in computer law:
- 1. The claim that something is a standard is usually
assumed to be an argument against protecting it, since we want
compatibility, but ...
- 2. One might get compatibility by having a standard owned
and licensed out.
- 3. The fact that something is a standard might be an
argument for protection; if we want good standards, we should
give their inventors protection so they can make money by
inventing them. One can argue that the superiority of the Mac
interface is the result of the fact that Apple, through its
copyrights on the Mac ROMs, had de facto ownership of the
interface, thus an incentive to design it well and keep improving it.
- E. Standards and the recent history of
economics--a policy perspective.
- 1. In the beginning: Smith, Ricardo, Marshall, laissez-faire.
- 2. Keynesian/market failure counter revolution. Dirigiste
- 3. Monetarist/public choice counter counter revolution.
- 4. Neo (not very) Keynesian counter-counter-counter revolution.
- a. Standards, network externalities, as part of that, since ...
- b. They imply that historical accident is important, you
might end up with a badly suboptimal but stable standard
(Qwerty--if you believe the orthodox account).
- c. And could prevent that by wise intervention at the
beginning, or ...
- d. Get out of it by government coordination afterwards.
- e. Argument is not about whether this is possible but about
how important it is--the Liebowitz-Margolis critique reverses the
argument, says the fact that Qwerty was stable means it was probably not much inferior.
- A. You should all have topics by now,
working on them. Try web browsing as a first step.
- B. Texas law:
- 1. Requires every ISP to provide information about some ...
- 2. Is it unconstitutional?
- 3. Is it sensible?
- C. Try thinking as the other side:
- 1. As the criminal, when imagining future computer crimes.
- 2. As the regulator, when imagining future restraints on encryption.
- 3. Puzzle for Thursday: If you wanted to stop strong
privacy at minimum cost in opportunities foregone, how would
you do it?
- D. What are you trying to get?
- 1. A secure conversation between two strangers?
Diffie-Hellman key exchange (a toy sketch follows this list).
- 2. But that is not secure against a man in the middle attack.
- 3. A message that can only be read by a known
recipient--as opposed to by the unknown person you are actually talking to.
- 4. And verifiable and non-repudiatable signatures.
- 5. 3 and 4 require a public key system.
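- A minimal sketch of the Diffie-Hellman exchange mentioned in 1 above, with toy numbers; real systems use primes of 2048+ bits, and the parameter values here are illustrative only.

```python
import secrets

# Public parameters, known to everyone (toy values; a real p is 2048+ bits).
p, g = 2089, 5

a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent

A = pow(g, a, p)    # Alice sends g^a mod p in the clear
B = pow(g, b, p)    # Bob sends g^b mod p in the clear

# Each side combines its own secret with the other's public value.
assert pow(B, a, p) == pow(A, b, p)   # the shared key, never transmitted

# An eavesdropper sees p, g, A, B and faces a discrete-log problem. A man in
# the middle, though, can simply run one exchange with each side--point 2 above.
```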
- III. Digression on details:
Digital signatures et al.
- A. How a hashing function works.
- 1. Take a text, compute from it a long number, much
shorter than the text ("hash" the text).
- 2. In such a way that it is easy for someone else to do
the same thing and make sure his number matches yours, but
- 3. It is extremely difficult for someone to start with
the long number and generate a text that will hash to the same number.
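- A sketch of these properties using SHA-256 (one common hashing function; the notes do not name a specific one).

```python
import hashlib

text = b"The quick brown fox jumps over the lazy dog."
digest = hashlib.sha256(text).hexdigest()   # a 256-bit number, however long the text

# Easy for someone else to repeat and check that the numbers match:
assert hashlib.sha256(text).hexdigest() == digest

# But changing even one character gives an unrelated digest, and nobody knows
# how to start from a digest and construct a text that hashes to it.
altered = b"The quick brown fox jumps over the lazy cog."
assert hashlib.sha256(altered).hexdigest() != digest
```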
- B. How a digital signature works to sign
and guarantee integrity.
- 1. You hash the text.
- 2. Then you encrypt the hash with your private key.
- 3. You send the text (unencrypted) along with the encrypted hash.
- 4. The recipient decrypts the hash with your public key,
hashes the text, and checks that the two numbers match.
- 5. If they do, then you sent the message (because only
you have your private key) and it has not been altered since
you signed it (because if it had been it would hash to a different number).
- 6. The whole message, text + encrypted hash, could in
addition be encrypted with the recipient's public key if you
wanted secrecy as well as a signature.
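- A hedged sketch of steps 1-5 using RSA from the Python `cryptography` package; the notes do not commit to a particular algorithm, and the library's sign() bundles the hash-then-encrypt steps together.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"I agree to pay $100 on June 1."

# Steps 1-2: hash the text, encrypt the hash with the private key.
signature = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

# Steps 3-4: recipient re-hashes the text and checks it against the hash
# decrypted with the sender's public key; verify() raises InvalidSignature
# if the text was altered or the key is wrong.
public_key.verify(signature, message, padding.PKCS1v15(), hashes.SHA256())
```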
- C. How digital time stamping works.
- 1. You have some information that you want to keep
secret, but be able to prove later that you had it now.
- 2. You write it up, hash the text, and publish the hash
as a classified ad in the New York Times or equivalent.
- 3. Three years later, when there is a dispute as to how
early you came up with the invention you are now trying to
patent (for example), you produce the original text and show
that it hashes to what you published three years ago.
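- The same trick in code form; the "invention" text is hypothetical.

```python
import hashlib

# Now: hash the writeup and publish only the digest (the classified ad).
invention = b"My process: anneal at 450C for three hours, then quench."
published = hashlib.sha256(invention).hexdigest()   # this string goes in the ad

# Three years later: produce the original text; anyone can confirm it hashes
# to the digest published back then, though the text itself stayed secret.
assert hashlib.sha256(invention).hexdigest() == published
```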
- IV. As computers get faster,
they can encrypt faster, decrypt faster, and break encryption faster.
- A. So you use longer and longer keys--which
take longer to encrypt and decrypt but are harder to break.
- B. It appears to be the case that the time
it takes to break the encryption increases more, as the key
gets longer, than the time to encrypt or decrypt. If so,
encryption can get stronger as computers get faster.
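- A toy calculation making the asymmetry concrete, assuming a symmetric cipher attacked by exhaustive key search; the cost units are made up, only the growth rates matter.

```python
# Legitimate use grows roughly linearly in key length; brute force grows as 2^bits.
for bits in (40, 56, 80, 128):
    encrypt_cost = bits          # ~linear work per block (illustrative)
    attack_cost = 2 ** bits      # exhaustive search of the key space
    print(f"{bits}-bit key: encrypt ~{encrypt_cost}, attack ~{attack_cost:.2e}")
```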
- V. Froomkin on the Clipper chip:
- A. The Clipper proposal.
- 1. Government designs and certifies a chip to encrypt phone conversations.
- 2. Each chip has a serial number and a chip key (single
key system), copies held by escrow agents. Secret Skipjack
single key algorithm
- 3. Two phones somehow negotiate a session key.
- a. For example, one phone generates key pair, sends
the other its public key, other phone generates session
key, encrypts with public key, sends back. Then erase key pair.
- b. Works except against a man in the middle attack.
- 4. They send the session key encrypted with the chip
key, all that + chip serial number encrypted with the family
key (the same for all chips) in the LEAF. Then talk.
- 5. Law enforcement agent records the conversation,
decrypts the outer layer of the LEAF to get the serial number, goes to
escrow agents to get the chip key, decrypts the encrypted
session key, decrypts the conversation.
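- A toy model of the LEAF layering in steps 4-5, using the Fernet cipher from the Python `cryptography` package as a stand-in for the classified Skipjack algorithm; the key names and message format are invented for illustration.

```python
from cryptography.fernet import Fernet

family_key = Fernet.generate_key()    # identical in every chip
chip_key = Fernet.generate_key()      # unique per chip; copies held by escrow agents
session_key = Fernet.generate_key()   # negotiated per conversation
serial = b"CHIP-0001"                 # hypothetical serial number

# Build the LEAF: session key under the chip key, then that plus the serial
# number under the family key.
inner = Fernet(chip_key).encrypt(session_key)
leaf = Fernet(family_key).encrypt(serial + b"|" + inner)

# Law enforcement: peel the outer layer to learn the serial number, take it
# to the escrow agents for the chip key, recover the session key.
found_serial, _, inner_out = Fernet(family_key).decrypt(leaf).partition(b"|")
assert Fernet(chip_key).decrypt(inner_out) == session_key   # chip_key from escrow
```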
- B. Technical problems.
- 1. There might be an intentional or unintentional back
door, since Skipjack is classified.
- a. Why is the algorithm classified? Doing it that way
makes it harder to convince people, harder to make sure
it is sound, and makes it harder to build the chip, since
it must be resistant to reverse engineering.
- b. Because they know there is a back door?
- c. Because they want to make it hard for people to
figure out clever ways of cracking it?
- d. Because they want to make it hard for people to
build bogus clipper chips?
- e. Because it embodies encryption ideas that they
want to keep secret for other purposes.
- f. But note that other people have done the
equivalent without those secret ideas, so why use them here?
- 2. No protection against pre-encryption (slow--in 1995).
- 3. People just using their own system--not mandatory.
- 4. Or someone with family key but without a court order
using LEAF for traffic analysis--doesn't require phone
company cooperation (like a pen register).
- 5. Leafblower allows one phone, with luck, to pretend to
the other that it is sending a legitimate LEAF when it
isn't. Slow--for now. But computers are getting faster, and the
checksum is only 16 bits (2^16 possibilities to search).
- 6. Once the key is out, must notify owner.
- 7. To hear both sides only need one key with phone, but
what about EMail?
- 8. Since it isn't in real time, you don't negotiate a session key.
- a. Need both parties' keys--and end up with the key
of everyone your subject has sent mail to. Or ...
- b. Modify the protocol, require the first
transmission to include a session key to be used between
those two people (actually those two clipper chips)
- c. And included, encrypted with the chip's key, in
every transmission in either direction.
- C. Market problems:
- 1. How do you get foreigners to use a system designed
for U.S. law enforcement to tap?
- 2. If foreign law enforcement can tap it too, what security is left?
- 3. Local clippers? Interoperable? How do you make sure
that only phones the French government has codes for are
used in France?
- 4. How do you get anyone to use it when they are locked
in (hardware), can't verify the algorithm, may not trust law enforcement's
honesty or competence?
- 5. By making it the standard, thus lowering the cost of
having a "secure" conversation without elaborate
pre-arrangements. Why do criminals use phones now, instead
of meeting in secure locations? Are there any secure locations left?
- 6. A different answer is that the government has
anticipated strong privacy and is trying to prevent it. If
Clipper encryption is the only encryption available to the
general public, then firms selling pirated software,
assassination, etc. will have no safe way of communicating
with the mass market.
- D. Administrative vulnerability:
- 1. All of the "guarantees" are unenforceable promises.
- 2. The president could, without announcing it, change
the rules on access to let (say) the FBI get at the archives
without notice, court order, etc.
- 3. Note that all of this could be changed, by setting
the system up via legislation instead.
- a. Why wasn't it done that way?
- b. Because Reagan and Bush didn't think they could
get it through?
- c. Or because they (and Clinton) wanted to keep their options open?
- E. Standards setting as an end run?
- 1. Using standard setting as an alternative to
legislation that does not require congress to agree.
- 2. Which did not work for Clipper (or for DSS).
- 3. Which might mean their theory is wrong.
- 4. Or they did it wrong--the standard problem was
actually still unresolved, since the machines had to
negotiate a session key.
- 5. The enhanced chip was supposed to take care of that.
- 6. Note that software reduces conversion costs, thus
strength of standards.
- a. Dvorak easy now by key remapping.
- b. Be/MacOS situation: The same hardware runs both operating systems.
- c. Can have one set of hardware, four different
encryption standards. Eudora does--for attached files.
- d. Consider the possibilities for multiple
currencies. My web page lists prices in dollars. Your
browser reports them to you in pounds--after checking
with your bank's web page to get this minute's conversion
rate. You buy the goods in pounds, I receive dollars,
your bank does the conversion.
- e. Something like this already happens with Visa,
money cards, etc. I can put my card into a money machine
in Canada, get Canadian dollars out, but end up with a
debit of my account in U.S. dollars.
- F. Was the process that produced the Clipper
chip illegal, given that NIST was supposed to be in charge, and
NSA seems to have called the shots? Unclear.
- G. Is judicial key escrow better?
- 1. It has the advantage of putting it under the control
of someone independent of the executive, but ...
- 2. May not fit into the constitutional powers of the judiciary.
- H. Ratchet problem with law enforcement powers:
- 1. Some technological changes increase the power of law
enforcement vis a vis the rest of us, some decrease it.
- 2. If every time a change decreases the power, the law
is changed to cancel the effect--for example, the digital
wiretap bill, to preserve law enforcement's ability to tap
phones, or the Clipper chip, ...
- 3. Then on average the power of law enforcement will increase.
- 4. For instance, voice recognition software will soon
make it much cheaper to tap phones, since a computer can
"listen" to the line, record the message, and pass it on to
a human if it contains the appropriate key words or phrases.
- I. Is mandatory escrow constitutional?
- 1. First amendment (free speech)
- 2. Second amendment (not discussed by Froomkin)
- a. According to ITAR, encryption software is a
munition of war.
- b. According to the second amendment, we have the
right to bear arms, so ...
- c. Does mandatory escrow violate the second amendment?
- 3. Fourth amendment (search and seizure)
- 4. Fifth amendment (self-incrimination)
- VI. Controlling Encryption--today we are playing the government's role, and trying
to figure out how we would control it if we wanted to.
- A. Registry of digital signatures, require
ISP's to check that everything has a registered signature
before passing it on to the customer.
- 1. Check the mailspool before downloading
- 2. Check the News Server messages before making them available.
- 3. Check web page before ... .
- B. Problems:
- 1. Can only interact with other countries that have the same system.
- 2. Makes the structure more rigid--everyone must filter
through a licensed ISP.
- 3. Raises processing costs, because of the checking it requires.
- C. Government censoring of News and the web:
- 1. ISP must block, cancel, on instructions from government.
- 2. Enormous amount of work.
- D. Forbid use of unescrowed keys, forbid pre-encryption.
- 1. Random checking at many levels of the system, with penalties for violations.
- 2. But the only way to check against pre-encryption is
by decrypting, reading?
- 3. Decrypt, have a secure computer "read" and only pass what it can read.
- 4. Simply make it a crime, prevent public sale of the
software, entrap people, etc.
- 5. What about communications with other countries?
- E. Costs of any such system:
- 1. Civil liberties costs--some of us don't trust our own government.
- 2. Use of networks becomes less attractive, because we
may fear leaks in the escrow process.
- 3. Costs of any checking, monitoring.
- 4. General loss of flexibility, through government control.
- 5. Problems in coordinating with foreigners.
- F. Benefits of such system:
- 1. Ordinary law enforcement.
- 2. IRS enforcement.
- 3. Prevent development of criminal enterprises with
brand name reputation.
- VII. Some possible
problems with Clipper
- A. To make sure the other device knows if
Clipper is being defeated, the Law Enforcement block contains a
checksum of the session key encrypted only with the family key.
- B. This only works if the family key is
secret! Otherwise a rogue Clipper chip (or equivalent in
software) can be built that gives the right checksum but gives
the wrong session key (which is encrypted with the chip's key,
which the other chip does not know and so cannot check).
- C. To keep the family key secret, it is
built into a "Decrypt device" --a sort of reverse Clipper Chip.
Like the Clipper Chip, it is presumably protected against
reverse engineering. Nobody need know the family key--if you
want to use it to decrypt the LEAF of intercepted
communications, you check out a decrypt device.
- D. So the Law enforcement block is
secret--but how long until a decrypt device is stolen? That
still does not let you build rogue Clipper chips (since you
can't get the family key out of the device), but it does let
you do traffic analysis (who is talking to whom) without a court order.
- E. One advantage to hardware is that while
it can be stolen, it cannot be (easily) copied. So if one
decrypt device is stolen, one criminal has it. If the family
key got out, pretty soon everyone would have it.
- VIII. Hardware v Software encryption--can we do the equivalent of Clipper in software?
- A. We need to use (tamper-proof) hardware
if we use a classified algorithm
- 1. since software could be disassembled to deduce the algorithm.
- 2. But we don't need to use a classified
algorithm--there are unclassified ones that will do just fine.
- B. Software can be modified to defeat the escrow.
- 1. But you can do the equivalent for hardware by modifying the hardware.
- 2. All you need to equal Clipper is a system that
requires modified programs on both ends to defeat the escrow
- 3. Or in other words one that, like Clipper (once they
fix it), prevents rogue communicators.
- C. Problem: If you use software, how do you
keep the Family Key secret?
- D. Solution--use a public Family key for
encryption, private for decryption (by law enforcement agent).
So only the public key is built into the program.
- E. The receiver can construct the LEAF and
check it is right! Since we no longer have keys associated with
chips, the LEAF constructed by either end is the same. We
encrypt the session key with the escrow agency's public key
(fancier version for multiple agencies) and include it in the
LEAF instead of escrowing chip keys with the escrow agency.
- F. Law enforcement takes the LEAF to the
escrow agency, gets back session key.
- G. All of this is my somewhat simplified
version of what the article proposes.
- H. The basic idea is that you can use
public key encryption plus software to accomplish the same
objective that the clipper chip accomplishes with private key
encryption and hardware.
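- A hedged sketch of that idea: the escrow agency's public key ships with every copy of the software, so the session key can be sealed into the LEAF with no per-chip secrets and no family key to protect. Names are illustrative, and one caveat is noted in the comments.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Done once by the escrow agency; only the public half ships with the software.
escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_public = escrow_private.public_key()
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Either end of the conversation can seal the session key into the LEAF.
session_key = os.urandom(16)
leaf = escrow_public.encrypt(session_key, oaep)

# (Caveat: RSA-OAEP is randomized, so the receiver cannot simply re-encrypt
# and compare ciphertexts; a real protocol must fix that detail. The point
# here is only the structure.)

# Law enforcement takes the LEAF to the escrow agency, gets back the session key.
assert escrow_private.decrypt(leaf, oaep) == session_key
```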
- IX. Beyond Clipper
- A. Capstone
- 1. DSA
- 2. Hashing algorithm: Explain.
- 3. Key exchange using public key encryption
- 4. Random number generator using a pure noise source
- B. Fair Cryptosystems.
- 0. Explain how the two key system works.
- 1. Decentralized, so each escrow agent can check that his
share is valid (a toy split is sketched after this list).
- 2. But centralized key distribution center, which knows
whether a public key has been certified.
- 3. Maybe do it with fewer than all pieces?
- 4. Time problem. Compute session key algorithmically
using private key.
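- A toy sketch of the splitting idea, using a simple XOR split in which all shares are needed; Micali's actual scheme adds the "fairness" property (each agent can verify his share) and allows any k of n shares, which this sketch does not show.

```python
import os

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split key into n shares; all n XOR back together to the key."""
    shares = [os.urandom(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine(shares: list[bytes]) -> bytes:
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

key = os.urandom(16)                  # the private key to be escrowed
shares = split_key(key, 3)            # one share per escrow agent
assert combine(shares) == key         # all three together recover it
assert combine(shares[:2]) != key     # fewer than all learn nothing
```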
- C. Mandatory key escrow with private escrow agents.
- 1. Enforced legally--it is against the law to encrypt
with an unescrowed key, or ...
- 2. Enforced technologically.
- a. You buy the software, create the keys, take both
of them to an escrow agent appointed by the manufacturer
of the software.
- b. Who gives you a certificate, digitally signed by
him, which you need to operate the software with that key
- c. Might combine with a centralized key registry,
coordinated with the escrow agents.
- 3. One could use ITAR to get private escrow going by
allowing the export only of software that requires escrow...
The NIST proposal. But who licenses the escrow centers?
- D. Non-mandatory but useful key escrow.
- 1. Think about the reasons you would want your (firm's)
key escrowed.
- 2. An employee who had the key protecting the records
could extort a large payment
- 3. Or a mistake by someone or some machine could
permanently cost you your records, since ...
- 4. You don't want a lot of copies of the key floating
around, for fear one will get out.
- 5. And most firms are not specialized in security--that
is the problem encryption is supposed to solve, not create.
- 6. But this works mostly for records, not transmissions.
- 7. The firm wants to make sure that if it really has to
it can read its employees' mail.
- 8. You end up with a system in which most records and
some transmissions can be read, but not those of people who
are doing seriously illegal things.
- 9. Sort of like Clipper?
- E. Centralized vs decentralized key distribution.
- 1. A centralized system ( "phone book")
- a. is very convenient, since it lets you find a
stranger's public key quickly and easily.
- b. And you don't need special information about each person.
- c. Compare a decentralized system, such as the
web--how do you judge the quality of what you find?
- 2. A decentralized "web of trust," a la PGP, lets you
- a. decide who you trust
- b. Combine information from multiple sources
- c. But depends on some link between you and the
person you are communicating with.
- d. Is that a problem after a while? Consider an
automated version, where you are compiling your own phone
book by having your software query everyone whose public
key it has as to all the ones they have.
- e. A phone book with a billion names and 128 bit keys
takes only about 64 gigabytes (10^9 entries at 16 bytes of key
plus ~48 bytes of name each); in ten years or so that
will be a standard hard drive.
- 3. An alternative is a certificate system, a la certificate authorities.
- a. A web of trust which depends on a small number of
widely known experts.
- b. And bundles the key exchange with the certification of identity.
- X. International:
Dorothy Denning thought piece.
- A. Multiple standards--at least if we don't coordinate.
- B. What do you escrow?
- 1. session key
- 2. Device/software copy key. Can try to control which
key is in which country.
- 3. User's private key. Escrow agents know who is under
surveillance. Also with chip if family key gets out.
- 4. Escrow at manufacture or first use (of that
key--allows revision); first use allows national escrow.
- 5. How do you enforce?
- a. Micali: Control the registry, can only register
escrowed keys. What about black market registries?
- b. Certificate of escrow, which must accompany the
transmission? Again, how to prevent black market.
- 6. Need both sides of the conversation. LEAF. EMail?
- 7. Export permission as bribe to include escrowed keys.
- 8. Who has access? Country that manufactured? Where
phone is? How do you handle international commerce?
- XI. Why does the
government care about cryptography?
- A. NSA.
- 1. Foreign govts might learn better codes, which would
be harder to crack.
- 2. They might realize that we can crack their codes--as
the German and Japanese governments did not realize in WWII.
- 3. Might learn to crack our codes.
- 4. But Russians are good mathematicians... . Letting
them know what we know as well as what they know might make
things a little easier for them, but it seems more likely
that secrecy is intended to keep information from smaller
countries and private citizens.
- B. FBI: Interested in being able to tap
phones. It tried to get such a bill in 1992, succeeded in 1994.
- 1. Digital Telephony may be mechanically harder to tap.
- 2. Cellular phone calls, especially on a system designed
to provide some privacy, may be harder to identify--you can
intercept a message, but how do you find the conversation
corresponding to a particular phone?
- 3. Encrypted phones are useful when the conversation is
over the air, but make wiretaps harder
- 4. All three of these can be dealt with by the
provider--but what if the user provides his own encryption?
- 5. Old Phone Phreak version of the problem. People who
knew the phone system well could route their calls by
indirect routes, making it very hard to trace them.
- 6. How much is the ability to tap phones worth?
- C. Have NSA, FBI etc. thought about strong privacy?
- XII. How important
is wiretapping? Freeh's arguments:
- A. "Terrorist crime will get worse
- 1. How many people have been killed so far by terrorists
in the U.S.?
- 2. How many such crimes could have been prevented by
wiretapping if, but only if, something like the digital
telephony bill was in force?
- a. Oklahoma City, if the current version of what
happened is true, could not have been--there was no large
scale conspiracy to tap.
- b. World Trade Center? Maybe--but you would have to
have a spy in the relevant circles to alert you to whom
to tap, and once you have the spy, tapping might be unnecessary.
- c. We do not know the answer to the question, the FBI
probably does not know. How plausible is their implicit
opinion (that there are a substantial number of such cases)?
- B. Wiretaps played a role in convicting tens of thousands of
felons over a decade--out of how many? Data from the
Statistical Abstract: 1987: 10 million people arrested, 2
million plus for serious crimes. I don't have figures for
convictions for the whole country, but for U.S. District
courts alone it was about 44,000.
- Convictions based on information received from wiretaps:
- Offenses specified of which those people were convicted:
homicide and assault: 18. Drugs and gambling: 514
- C. "At least we can say that we
- 1. Imagine a case involving a stolen Russian nuclear
weapon used to blow up a U.S. city.
- 2. Everything hard happens outside the U.S., where our
encryption rules don't apply anyway.
- 3. All they have to do inside the U.S. is to ship a
package smaller than an automobile to the intended city and
detonate it--which should be trivial for any halfway competent organization.
- D. Denning "I was exposed to cases where
wiretaps had actually stopped crimes in the making."
- 1. How does she know what would have happened without them?
- 2. Or that she is being told the truth?
- 3. And how do we know that she is telling the truth?
- 4. Given that the "evidence" is all secret, and the
people who control it have a strong interest in having us
believe what they want us to.
- XIII. Chapter
- A. DSS vs RSA
- B. Govt proposal:
- 1. Can only be used by appropriate people
- a. connected to serial number of decryption device
- b. Another reason to keep the algorithm
secret--control production of decryption devices?
- 2. Time limited. How?
- 3. No substantive rights to suppress evidence obtained
in violation of these procedures!
- C. EPIC or our proposal?
- 1. Set voluntary standards.
- 2. Provide digital passports.
- 3. Certify voluntary escrow agencies.
- 4. ??? Anything else???
- XIV. Chapter 4: Policy dispute.
- A. Denning: There is no back door. She
reached this conclusion after how much time trying to find one?
- B. Denning argues that the algorithm is
classified to prevent unescrowed versions of Skipjack
appearing--but RSA is already out there, and whether or not it
is as good as Skipjack, it seems to be good enough.
- C. Gelernter: Solving or preventing a large
number of ghastly crimes. 18 homicides and assaults (Statistical Abstract).
- D. Baker:
- 1. Clipper conservative, since it merely lets the
government continue to do what it is already doing.
- a. This is mostly true.
- b. It raises the question--do we want more privacy
from govt than we now have?
- c. Some would answer "yes."
- 2. Standardization to catch stupid criminals. But note
that with a little progress, alternative systems will be not
only available but cheap and easy.
- 3. Companies want escrow--then let them have it.
- 4. Only government can do it--what about PGP, RSA?
- 5. Export restrictions don't affect foreign govts.
- XV. Chapter 5: The digital telephony act
- A. Why is there a problem?
- 1. Digital phones are technically harder to tap.
- 2. Cell phones have good reasons to encrypt.
- 3. Private encryption is becoming practical.
- B. Bill provides that:
- 1. Carrier must make it possible to intercept and
isolate calls to or from particular subscriber.
- 2. Ditto for call identifying info
- 3. Get the stuff to govt agency
- 4. Cannot require phone company to provide decryption if
they don't have it.
- 5. If the call is handed off to someone else, tell govt.
- 6. Govt will tell them capacity at some point. (flap)
- 7. Provides $500,000,000/year to pay for it.
- C. Freeh says "most important." Actually
phone taps are used almost entirely against drugs and gambling.
His side provides lots of anecdotes but little data.
- 1. What is the real cost/benefit calculation? Don't know.
- 2. How much capacity are they asking for?
- I. CDA arguments:
- A. Three parts:
- 1. (the transmission provision), imposes criminal
penalties on "[w]hoever * * * by means of a
telecommunications device knowingly * * * (i) makes,
creates, or solicits, and (ii) initiates the transmission
of, any * * * communication which is * * * indecent, knowing
that the recipient of the communication is under 18 years of age."
- 2. (the specific child provision) imposes criminal
penalties on any person who uses an "interactive computer
service" to "send to a specific person or persons under 18
years of age * * * any * * * communication that, in context,
depicts or describes, in terms patently offensive as
measured by contemporary community standards, sexual or
excretory activities or organs."
- 3. (the display provision) imposes criminal penalties on
persons who use an interactive computer service to "display"
patently offensive sexual material "in a manner available to
a person under 18 years of age." 47 U.S.C. 223(d)(1)(B).
- B. Defense: The CDA establishes a "defense
to * * * prosecution" for a person who "has restricted access
to such [indecent] communication by requiring use of a verified
credit card, debit account, adult access code, or adult
personal identification number," 47 U.S.C. 223(e)(5)(B). A
defense to prosecution is also available to those who have
"taken, in good faith, reasonable, effective, and appropriate
actions under the circumstances to restrict or prevent access
by minors" to their indecent communications.
- C. Objections by the district court:
- 1. It interferes with adult to adult transmission of
sexually explicit material.
- 2. Is subject to strict scrutiny and not narrowly tailored.
- 3. Because it is not practical for many users
(listserves, web pages that are not commercial porn sites)
to avoid making the information available to children.
- 4. "Community standards" is too vague--what is the
- D. DOJ reply:
- 1. Transmission is just as constitutional as existing
realspace restrictions. No first amendment right to send
indecent material to minors.
- 2. Display provision is analogous to time of day
(Pacifica) or place (zoning) restrictions, in order to keep
things from minors. Problem is greater here, since it is harder to keep minors out.
- 3. Does not unduly restrict adults, since adult ID cards
etc. are practical.
- 4. Screening software cannot keep up, only a few people
have it. Limiting children to sites that affirmatively are
child suitable would cut out most of the web.
- 5. Anyway, provisions are severable.
- 6. Sable Communications of Cal., Inc. v. FCC
suggests that credit card etc. are reasonable alternatives
to a ban.
- 7. Knowledge requirement on the first two provisions
eliminates problem of not knowing who you are sending to.
- 8. Display provision is at least constitutional for commercial porn sites.
- E. But ...
- 1. How do you provide a useful ID card, when person A
can post his ID for all the world to borrow? Public key
cryptography? If he posts his private key, he loses
anonymity. But this only works once almost everyone is using
public key encryption.
- 2. Like reselling Playboy. But easier--the kid can do it
from the kid's bedroom.
- 3. How about anonymous adult ID cards? If you can do it
at all, you can do it in a form where the issuer checks that
you are an adult, gives you an ID, but does not record which
adult got which ID. If you trust the issuer.
- 4. What about self-identification by sites as a
defense--plus filtering software. DOJ mentions the
possibility, but it is not included in the defenses.
- F. Old technology: Is this different from
forbidding the publication of Playboy et al. on the grounds
that they certainly get to minors, and cannot be kept from them?
- G. Note that one element of the bill is a
suppression of anonymity, at least for those who want to read
porn. But reading porn is an area where anonymity may be
important. Just ask Justice Thomas.
- II. Security issue: Macromedia Shockwave multimedia plugin
- A. Can read files on a machine that is
reading a (suitably designed) web site if it has the path.
- B. Can read a directory if it has the hard disk's name.
- C. How should such problems be reported,
given the conflict between the desirability of warning possible
victims and the undesirability of informing possible criminals
of their opportunities?
- III. Security Issue: ActiveX and Quicken.
- A. Chaos Computer Club of Hamburg in
January demonstrated an Active X control that, when downloaded
to a machine using Quicken for automatic bill paying,
automatically pays "bills" to the account of whoever provided the control.
- B. Quicken claims 9 million plus users.
- C. Microsoft response--don't download code
unless it is digitally signed.
- Furthermore, this situation cannot occur if
customers take advantage of the built-in security features in
many Internet browsers, such as Internet Explorer, that alert
users to the installation of an unauthorized or unsigned
ActiveX component. Customers who are concerned about the safety
of ActiveX controls should consider disabling the ActiveX
capability in their browser or using a browser such as Netscape
Navigator which does not support ActiveX.
- D. Note the two different strategies being
used to permit Web browsers to download software.
- 1. ActiveX strategy: The software can do lots of things,
some undesirable--but you know where it came from.
- 2. Java strategy: The software is limited to the
"sandbox"--can't write to your disk, etc.
- E. Symantec's response: use their software
to encrypt your Quicken files.
(ActiveX controls can write to a computer's hard disk, can
call on any server, not just the one the control came from,
and can access a wide range of local computer resources.)
- IV. Chapter 6: Diffie.
- Who is Diffie? One of the inventors of the
idea of public key encryption.
- A. Diffie: Important point.
- 1. Most of what keeps us secure isn't high tech, it is
recognizing faces, voices, signatures, holding conversations face to face.
- 2. That is going away as we move to an online world.
- 3. Crypto substitutes--if permitted.
- 4. Constitution did not enumerate right of private
conversation--because it couldn't be violated. Smith quote.
- 5. Certainly it aids criminal conspiracies.
- B. How important is wiretapping?
- C. Encrypt LEAF, decrypt at other end?
Prevention requires govt control of equipment.
- D. Govt or public innovation in cryptography?
- 1. NSA may have the mathematicians, but ...
- 2. No evidence that they saw the practical uses--Public
Key, ECash, voting, etc.
- 3. Why should they--that's not their business.
- E. Lots of technologies that provide more
surveillance--except for DBS regulation, unregulated.
- F. We should have "Security in depth" for
democracy--meaning they not only require a court order, they
also can't do it if they have one.
- V. PGP Story:
- A. PGP for porn diary? What's wrong with
DES? You don't need a public key system for protecting your own files.
- B. Employee's EMail? Not a problem, unless
you don't want them to know that you feel free to read it.
- VI. Federal Encryption related law:
- A. Laws on govt procurement for its own use.
- B. ITAR (now Commerce)
- C. Invention secrecy act--govt may seal
patent applications if info detrimental to the U.S.
- VII. Perfect Crimes article
- A. Blind signatures
- 1. Don't worry about the math, save that a "one way
function" means ... that from x you can calculate f(x) but
from f(x) you cannot calculate x. The effect is ...
- 2. A pool of money at the bank, such that the bank can
confirm that a payment is an authorised one from the pool,
but cannot link the particular payment to the owner of
particular money in the pool.
- 3. Fancier version of the protocol I described.
- B. Allow untraceable transfers.
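- A toy RSA blind signature, the math behind the pool-of-money idea above; tiny numbers for readability, where a real system hashes the message and uses 2048+ bit keys.

```python
import secrets
from math import gcd

p, q = 61, 53
n = p * q                            # bank's public modulus
e = 17                               # bank's public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # bank's private signing exponent

m = 1234                             # the customer's coin (hashed, in a real system)

# Customer blinds the coin: the bank will see only m * r^e, which looks random.
while True:
    r = secrets.randbelow(n - 2) + 2
    if gcd(r, n) == 1:
        break
blinded = (m * pow(r, e, n)) % n

# Bank signs the blinded value without learning m.
signed_blinded = pow(blinded, d, n)

# Customer unblinds: (m^d * r) * r^-1 = m^d mod n.
signature = (signed_blinded * pow(r, -1, n)) % n

# Anyone can check the bank's signature on m--but the bank cannot connect it
# to the blinded value it actually signed, so the payment is untraceable.
assert pow(signature, e, n) == m
```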
- VIII. Lots more stuff on the CDA.
- A. We have: Oral Argument, ACLU Brief,
Government Brief, Congressional supporter's amicus brief,
Chamber of Commerce amicus brief, others ...
- B. Arguments raised include ...
- C. Is screening by providers practical?
- 1. Commercial web sites.
- a. Adult ID. No evidence of how they check or whether
it works. Obvious problem never discussed.
- b. Credit card verification. Would force samples
behind the wall, which makes sense from the sponsors' standpoint.
- c. Works for small children, since parents have an
incentive to keep credit cards from them.
- d. Credit card companies know age?
- e. So it eliminates free commercial porn, but ...
- f. Define "commercial." Consider free sites at
present, with pointers to commercial sites. Advertising
vs subcontracting is a fine line on the web.
- 2. Non-commercial web sites, such as AIDS info, gay
info, anti-censorship (ACLU has the dirty words up),
libraries with on-line journals, card catalogs, etc.
- a. For many of them, becoming commercial, charging
everyone, eliminates their function.
- b. Even if some credit card companies will do it for
non-chargers, they will charge to do so.
- c. So you need the adult ID
- d. Which doesn't work.
- e. So such sites either rely on what they do not
being covered (vagueness argument) or ...
- f. Self censor."you can simply run it through some
sort of word processor or computer program to screen
--it's only text, after all, on cards, and if you find a
card that ..." (oral agmt)
- 3. Usenet News:
- a. No screening possible by the posters.
- b. Conceivably, newsservers could screen--but not
liable under the act, and unless all of them do it,
posters still liable.
- c. So they have to all set up web pages instead?
- d. Or self censor.
- 4. Listserves:
- a. Sending rather than making available, presumably,
- b. If anyone on the list has identified himself as a minor,
- c. Everyone else must self censor. Ditto for chat rooms.
- d. Govt chat room mistake.
- 5. EMail.
- a. Not display, and only knowing counts?
- b. Do we want to criminalize locker room discussions
among high school students if over EMail?
- D. Vagueness of standard:
- 1. One part says "indecent," another part has "patently offensive."
- 2. Govt claims that redeeming social etc. will come in
to show things are not patently offensive, but ...
- 3. What community? Some are easily offended.
- 4. Govt did not give a consistent story about what was covered.
- 5. Which is a serious problem with criminal penalties.
- E. Overbroadness problems:
- 1. Adult tells library that his 17-year-old son uses
the machine next to him, and he has an adult ID. "No, but
my motive is that I'm Anthony Comstock, and I don't want
this stuff to go out, so I'm telling you I've got a
17-year-old son who's going to help me police the airwaves."
- " You're saying that any adult has a heckler's veto on
the whole operation by simply saying I'm going to let my
child watch it?"
- 2. Adult lets his minor child watch on his machine--and
is a criminal.
- If we combine the display section and the knowingly
permit section, I take it that a parent who allowed his
computer, the computer that the parent owned, to be used by
his child in viewing offensive material, indecent material,
the parent would also go to prison, I take it.
- -- it's an offense to display the material, as I
understand it under the display section, where minors will
obtain it, and if a parent says I'm going to allow,
knowingly allow my computer to be used by my child to
observe these displays, isn't the parent therefore guilty of
the knowing, under the knowingly permit section? "any
knowingly "permit[ted]" use of a telecommunications facility
for such purposes"
- 3. It would be a crime, 2 years in jail, for a parent to
send an indecent E-mail message to the parent's 17-year-old
college freshman son or daughter.
- 4. Tagging option: Resisted by both sides, pushed by
court. (govt side) : "And it would be better than what we
have now, but it would not be either more effective or less
restrictive than the Communications Decency Act." (ACLU
side) "Well, I think it would raise significant compelled
speech questions." Wouldn't protect the site from liability
under the proposed rules.
- F. How restrictive is the additional burden?
- 1. There are about 100,000 Web sites in all. And most
speakers cannot afford the $1,000 to $10,000 it costs to
have their own Web site. [cgi script capability needed to
screen for age. AOL etc. don't have it.]
- 2. Discussion of what prohibitively expensive means.
Radio licensing as an example.
- 3. Cost of alternatives: Free for AOL etc.
- 4. Tagging: would not protect under these rules.
- G. Effectiveness:
- 1. Does 50% protection count (given the existence of
foreign sources of porn which we can't regulate)?
- 2. Don't filtering or tagging solutions
dominate--assuming that many foreign sites would voluntarily comply?
- 3. At present, do foreign sites have "are you an adult" front doors?
- 4. What if the statute claims to apply abroad too?
- H. Chamber of Commerce worries:
- 1. "The imposition of sweeping criminal penalties based
on the content of communications on these networks, under
extremely broad and imprecise statutory standards, would
produce a chilling effect on business use of online
communications, and severely threaten the development of
this important infrastructure."
- 2. The nature of interactive computer communications
also renders the need for certain, bright-line standards of
potential liability crucial. Interactive computer
communications typically take place directly between users,
without the intervention of administrators or overseers, and
often take place in or in close proximity to "real time."
For this reason as well, the vast majority of the hundreds
of millions of messages that cross interactive computer
services each day are not, and cannot be, "screened," much
less fully reviewed by legal counsel.
- 3. Companies providing information regarding similar
products might be transmitting material considered "patently
offensive" or indecent to certain communities or by certain
prosecutors. Discussion of such conditions or treatments as
breast and prostate cancer, reconstructive surgery, penile
implants, digestive diseases and related conditions,
sexually transmitted diseases, contraceptives, birth
control, and any other product or service relating to
"sexual or excretory activities or organs" might be deemed
"patently offensive," subjecting those responsible for such
material to two years' imprisonment.
- 4. As noted, moreover, numerous information
technology-dependent companies employ 16-year olds and
17-year olds as interns or employees. See page 6, supra. The
mere use of expletives in e-mail sent to or available to
minor employees could subject such companies to liability.
- I. What about ISP liability?
- 1. (2) knowingly permits any telecommunications facility
under such person's control to be used for an activity
prohibited by paragraph (1) with the intent that it be used
for such activity, vs
- 2. No person shall be held to have violated subsection
(a) or (d) solely for providing access or connection to or
from a facility, system, or network not under that person's
control, including transmission, downloading, intermediate
storage, access software, or other related capabilities that
are incidental to providing such access or connection that
does not include the creation of the content of the communication.
- 3. How does this apply to news servers? Government
points out that they can choose which groups to carry.
- IX. Sable v
FCC: One of
the closest things to a relevant precedent
- A. Sable operated a dial-a-porn business.
- B. Asked for declaratory and injunctive
relief against 1988 amendments to s223(b), which imposed a
blanket prohibition on indecent as well as obscene interstate
commercial telephone messages.
- C. District court rejected the claim that
"obscene" part was unconstitutional because it created a
national standard, but struck down "indecent" part. Not
narrowly enough drawn.
- D. Legislative history:
- 1. Previous version (223b) made it a crime to make
obscene or indecent calls for commercial purposes to a
minor, or to anyone without that person's consent, and asked
the FCC to promulgate regulations for screening out minors.
- 2. FCC provided that operating only after 9 PM, or
requiring credit card payment, would qualify.
- 3. Carlin Communications v FCC: 2nd circuit cancelled the
time restriction as "over and under inclusive", remanded to
the FCC.
- 4. Which changed the rule to credit card payment or adult
ID access code, requiring the vendor to develop a screening
system.
- 5. Carlin II rejected that, because of inadequate
consideration of customer premises blocking.
- 6. Third version added possibility of message
scrambling, with unscramblers only sold to adults.
- 7. Carlin III: district court approved, but said to
reopen if less restrictive technology appeared; the Appeals
court invalidated the scheme insofar as it applied to
nonobscene speech (on constitutional grounds).
- 8. Congress amended, 1988, to make the ban apply to
indecent as well as obscene messages, all ages, no screening
defense.
- E. Supreme Court says:
- 1. Obscene part constitutional; no more a "national
standard" than mail acts.
- 2. Knowing who is calling from where may be costly, but
that is Sable's problem. There is no constitutional impediment
to imposing costs on them.
- 3. But indecency part is out, because not sufficiently
narrowly drawn.
- 4. Distinguished from Pacifica, because of:
- a. Time of day, not ban, and
- b. "Uniquely pervasive" nature of the medium, which
can intrude on the home without prior warning, (not true
of web) and ...
- c. "uniquely accessible to children, even those too
young to read." (Web pictures but not text?)
- d. Whereas here, medium requires the listener to take
affirmative steps to receive the communication. [like the
Web.]
- 5. Scalia points out that how much risk you are willing
to take of some children seeing material that is indecent
but not obscene depends in part on where you draw the line
between obscene and indecent.
- 6. Brennan, Marshall and Stevens hold that the obscene
part is unconstitutional too! The concept cannot be defined
clearly enough to provide fair notice to those who
distribute sexual material.
- F. Alternatives for the CDA, in light of Sable:
- 1. Do nothing--let parents use filtering software.
- a. Easy to do if you are willing to filter out the
internet, keep kids in the AOL sandbox.
- b. Fairly easy, but requires some cost and
initiative, if you are willing to do it imperfectly in
both directions (netnanny etc.)
- 2. Require front doors. (Cf. the Sable point
distinguishing Pacifica: the listener must take affirmative
steps.)
- 3. Define tags, do nothing else, let existing law apply.
- 4. Define tags, apply CDA with tag as a defense.
- 5. Adult ID, with penalties for letting it out.
- a. But porn site knows it?
- b. Not if we use public key encryption--porn site has
public key, adult has private. (See the sketch after this
list.)
- c. Does an adult who uses his ID to let a minor view
indecent material fall under the act?
- d. Requires all sites to have cgi scripting ability
- 6. Absolute ban, limited to obscenity. National standard?
- a. Sable majority says that is all right, but ...
- b. Difference between technologies, between imposing
some cost and infinite cost?
- 7. What about regulating search engines to make porn
harder for kids to find?
- a. They all have cgi scripting abilities already.
- b. But kids can circulate dirty URL's.
- 8. What about some of these options, applied to
obscenity instead of indecency?
- a. Most of the things we are most worried about closing
down (AIDS info, gay info, literature, etc.) would pass a
national standard for obscenity (redeeming social value).
- b. Most of the things they are most worried about
getting to kids would flunk.
- 9. Whatever the rules, can we enforce them against
foreign sites by making advertisers on a site liable?
- a. Legal problems.
- b. Current porn sites don't support themselves by
ads, perhaps because most companies don't want the
association.
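[A minimal sketch of the public-key idea in option 5.b, assuming the
adult ID is a signing key pair whose public half some issuer
certifies. The Python `cryptography` library and every name here are
illustrative, not part of any actual proposal; the point is that the
site verifies a signature on a fresh challenge and so never holds
anything a minor could reuse.]

```python
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The adult's ID is a key pair; an issuer would certify the public half.
adult_private = Ed25519PrivateKey.generate()   # stays with the adult
adult_public = adult_private.public_key()      # all the site ever sees

# The site issues a fresh random challenge for each visit ...
challenge = os.urandom(32)

# ... the adult signs it, proving possession of the private key ...
signature = adult_private.sign(challenge)

# ... and the site verifies against the certified public key.
try:
    adult_public.verify(signature, challenge)
    print("age verified; the site never learned the private key")
except InvalidSignature:
    print("access denied")
```

Because each challenge is fresh, a recorded exchange is useless to a
minor; the hole that remains is 5.c, an adult who signs challenges on
a minor's behalf.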
- X. Clipper III:
"Voluntary" infrastructure with escrow.
- A. This purports to be a draft the
administration is trying to find a sponsor for.
- B. It sets up a structure of registered
certificate authorities and key recovery agents. Does not
ban unregistered ones.
- C. A registered CA may only issue
certificates to people who have stored their private key
(sufficient info to recover plaintext) with a certified
KRA. (A sketch of the escrow mechanics follows D below.)
- D. Any KRA, certified or not, must release
the key to a warrant or court order, subpoena, certificate
issued under the Foreign Intelligence Surveillance Act, or
"upon receipt of written authorization in a form to be
specified by the Attorney General"
- 1. The AG is supposed to write regulations to assure
that such authorization only issues when a government agent
is lawfully entitled to determine the plaintext and will use
the information for that purpose, to prove facts in legal
proceedings, or to comply with a request from a duly
authorized agency or a foreign government!
- 2. It is a criminal offense for any KRA, authorized or
not, to tell the owner of the key that it has been released
pursuant to these procedures.
- 3. If a government agent knowingly obtains the
information without lawful authority, or knowingly misuses
or discloses it, the govt is liable for actual damages and ...
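[A minimal sketch of the escrow mechanics in C and D, assuming RSA
key wrapping via the Python `cryptography` library; the draft
specifies no algorithms, so everything here is illustrative. The user
deposits the private key with a KRA as the price of a certificate; a
warrant served on the KRA then recovers any plaintext encrypted to
the certified key.]

```python
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The user generates a key pair ...
user_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# ... and, as the price of a certificate from a registered CA,
# deposits the private half with a certified KRA.
kra_vault = {}
kra_vault["user@example.com"] = user_key.private_bytes(
    serialization.Encoding.PEM,
    serialization.PrivateFormat.PKCS8,
    serialization.NoEncryption())   # a real escrow would protect this

# Someone sends the user mail encrypted to the certified public key;
# the user reads it normally.
ciphertext = user_key.public_key().encrypt(b"the message", OAEP)
assert user_key.decrypt(ciphertext, OAEP) == b"the message"

# An agent with a warrant (or the AG's written authorization) serves
# the KRA, which must hand over the escrowed key ...
escrowed = serialization.load_pem_private_key(
    kra_vault["user@example.com"], password=None)

# ... and the agent recovers the same plaintext.
assert escrowed.decrypt(ciphertext, OAEP) == b"the message"
```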
- E. Carrot:
- 1. A registered CA or KRA has a $10,000 ceiling on civil
liability for releasing information unlawfully, telling the
owner of a lawful release, or various other violations.
- 2. Compliance with the act is a complete defense to any
non-contractual civil action--for registered CA's and KRA's.
- 3. Using encryption in furtherance of a criminal offense
(however small?) is a five-year offense--but having your key
escrowed with a registered KRA is a defense.
- F. But ...
- 1. An unregistered CA can still exist, need not escrow,
and thus cannot be forced to give up anything.
- 2. But a CA needs to be big and visible, so perhaps
vulnerable to govt pressure?
- 3. And an unregistered KRA must still give up the keys, and
- 4. May not tell the owner it has been forced to do so.
[These last two are things that never got adequately covered in
class, but I thought you might still be interested in them; I was
holding them to fill in if there was a gap in the papers.]
- XI. Randal Schwartz
- A. What did he do?
- 1. Took two password files, one from Intel, one from his
publisher, and used Intel computers to try to crack them--a
standard security precaution, but ... (see the cracking
sketch at the end of A).
- 2. He was not authorized to do so--the Intel computers
were not ones he was currently responsible for--and he did
not tell anyone he was doing it. He apparently got the Intel
file by using a password that he found on a different Intel
machine--which was a dictionary word.
- 3. He found lots of crackable passwords in the Intel
file, one in the publisher's file.
- 4. The publisher's sysop said she would have been
annoyed at him if she knew he was doing it.
- 5. Did he use any of the passwords?
- a. Only the one that was used to get at the password
file, so far as we know.
- b. He told the investigators where to find the
cracked passwords, but badly underestimated how many
there were--suggesting that he wasn't paying
attention to the process.
- 6. He also maintained a gateway through Intel's firewall
so that he could access his EMail, etc.
- a. He originally established a gateway at Intel's
request for the use of a group that included people
outside Intel.
- b. Maintained it for his own use (he was sysop for
the DNS machines).
- c. Was asked to improve its security, did so; was later
asked to remove it from one machine, did so, but ...
- d. Later put it back on another. He says because he
thought that group's security rules were less stringent.
- e. He was operating through that gate (controlling
Intel machines from his O'Reilly account) when he was
caught.
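[A minimal sketch of what running crack amounts to: hash every
dictionary word with each account's salt and compare against the
stored hash. The hash format here (salted SHA-256) and all entries
are invented for illustration; the real tool targeted Unix crypt()
hashes.]

```python
import hashlib

def hashed(word: str, salt: str) -> str:
    # Illustrative salted hash; the real Unix scheme was DES crypt().
    return hashlib.sha256((salt + word).encode()).hexdigest()

# A toy "password file": user -> (salt, stored hash).
password_file = {
    "alice": ("x7", hashed("tomatoes", "x7")),    # a dictionary word
    "bob":   ("q2", hashed("i7&wQ!90pz", "q2")),  # not in any dictionary
}

dictionary = ["password", "letmein", "tomatoes", "dragon"]

# The attack: try every word against every entry.
for user, (salt, stored) in password_file.items():
    for word in dictionary:
        if hashed(word, salt) == stored:
            print(f"{user}: crackable -- {word!r} is in the dictionary")
```

The same weakness is reportedly how he got the file in the first
place (A.2): the password he found was a dictionary word.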
- B. Relevant facts:
- 1. He was operating under an account issued to
him--implausible for a sophisticated criminal.
- 2. Intel was actively involved in the prosecution.
- 3. He had quit an earlier Intel contract early because
of a conflict with someone administering it.
- 4. One of the passwords cracked belonged to an Intel VP.
- 5. Schwartz is a big name in computer circles--author of
widely used books on Perl, teaches classes, etc.
- C. Allegations:
- 1. It was alleged by an Intel person and police, not
conceded by defense, that Randy said he wanted passwords in
case his account was terminated. Possibly confusion with
wanting the gate so that he could get at his EMail from
outside.
- 2. He was interviewed extensively by the police without
his lawyer. They did not use their tape recorder. They
reported their summary of what he said--much shorter than
the time of the interview.
- 3. They alleged that he said he had fantasized about
doing computer crime (probably true).
- 4. "I *now* understand why I'm not supposed to talk to
cops without a lawyer present, as the difference between
what I understood to say and what actually ended up on the
paper is nearly night and day."
- D. Issues:
- 1. To what extent can a firm use criminal law to enforce
its internal regulations?
- a. "Unauthorised" alteration of a computer--i.e. what
Randy was hired to do, if done without authorisation--is a
crime.
- b. Nevada requires the employee's presumption of
authorization to be overcome by "clear and convincing
evidence to the contrary." Oregon doesn't say.
- 2. How does mens rea come into this?
- a. Running crack against password files is apparently
now a standard Intel security precaution.
- b. Does Randy have to have criminal intent to violate
the statute?
- c. Is it theft to transfer a password file from one
Intel machine to another?
- d. Is cracking theft--of information?
- e. Prosecution claims intent to steal services (that
he needed passwords in case the account was closed), but
it is not clear that this was true.
- 3. What does Intel think it is doing?
- a. They know what was going on, and are sending a
message to employees and subcontractors to follow the
rules.
- b. They really think he was trying to steal their
services.
- c. Their left hand does not know what their right
hand is doing.
- XII. Remailer liability issues?
- A. Are you an accessory if remailer is used
for illegal purposes?
- 1. If you know that it sometimes will be?
- 2. Is the phone company?
- 3. But you are set up precisely to provide anonymity.
- 4. But anonymity has legal purposes. Hooding laws?
- B. Only the last remailer in the chain is
at risk--so put it somewhere the law can't reach.
- 1. There could be a bogus last remailer.
- a. you only get to use it once before its cover is
blown.
- b. But you could hold your fire while accumulating
evidence.
- 2. If a case is cracked, police could find out after the
fact what remailers were used.
- C. Note that a remailer can always insert
additional links, cost considerations aside.
- D. Might be prudent to separate your
remailer persona from your realspace persona.
- E. And be prepared to scrap one remailer if
problems arise, and start another.
- F. Remailers don't need very much
reputation, if people routinely chain them. (A sketch of
chaining follows.)
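[A minimal sketch of chaining, using symmetric keys from the Python
`cryptography` library purely for illustration (real remailers used
PGP and public keys). Each layer tells a hop only the next address;
only the last remailer learns the true destination, which is why only
it is at risk (B above).]

```python
from cryptography.fernet import Fernet

# Three hypothetical remailers; each would publish its own key.
chain = ["remailer-A", "remailer-B", "remailer-C"]
keys = {name: Fernet.generate_key() for name in chain}

def build_onion(message: bytes, destination: str) -> bytes:
    """Wrap the message in one layer per hop, innermost layer first."""
    packet, next_stop = message, destination
    for name in reversed(chain):
        # Only `name` can open this layer, and it reveals no more
        # than where to forward whatever is inside.
        packet = Fernet(keys[name]).encrypt(next_stop.encode() + b"|" + packet)
        next_stop = name
    return packet

def peel(name: str, packet: bytes):
    """What one remailer does: strip its layer, learn the next address."""
    inner = Fernet(keys[name]).decrypt(packet)
    next_stop, _, rest = inner.partition(b"|")
    return next_stop.decode(), rest

packet = build_onion(b"hello", "alice@example.com")
stop = chain[0]                      # the sender mails the first hop
while stop in keys:                  # each hop peels and forwards
    stop, packet = peel(stop, packet)
print(stop, packet)                  # alice@example.com b'hello'
```

Note how C above falls out of the code: any hop could itself build a
further onion, inserting extra links.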
- I. What have we learned?
- A. What the new technologies of crime,
protection from crime, oppression, and protection from
oppression are.
- B. How they can or cannot be fitted into
existing legal rules, in particular:
- 1. The importance of which metaphor you choose.
- 2. The importance of technology in determining what laws
can be enforced
- 3. And what laws need be enforced: Private protection of
privacy via encryption.
- C. What threats and opportunities are
implicit in those technologies.
- II. Orwell Lesson:
He correctly predicted the direction things were going, but
badly overestimated how fast things would change--an error
intellectuals are prone to make.
- A. These developments are not here yet, but
- B. They are visible on the horizon.
Scientology conflict and Cornell case both give examples of low
level information warfare, for instance.
- C. Various people, in and out of
government, are maneuvering to deal with them.
- D. And you are, hopefully, ready to
understand the developments as they occur.
- III. Thanks for helping me
teach the class.
Have a nice summer.