Basic Information Security and Privacy Glossary
Compiled by Barry Rein, MSc, CISSP
Availability
The probability that a system or information is available at a given time. Availability may be expressed in general terms as the reliability of a system, or more specifically as the percentage of time a system is up and running. In the latter case, this is usually given as a series of “nines”, e.g. “five nines” means the system is up 99.999% of the time.
The triad of confidentiality, integrity, and availability forms the core principles of information security. Most security efforts are focused on maintaining these three.
For information to be useful, it must be available to authorized parties at the time they need it. Methods like backups, database replication, or redundant network carriers are commonly used to ensure information remains available.
Challenges to availability might come from hardware or software problems leading to a system outage. Or, availability might be affected by a malicious cause such as a denial-of-service (DoS) attack.
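The “nines” notation above maps directly to allowed downtime. Here is a minimal sketch in Python; the constant and function names are illustrative:

```python
# Translate an availability percentage into allowed downtime per year.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600; leap years ignored for simplicity

def downtime_minutes_per_year(availability_pct: float) -> float:
    """Minutes of downtime per year permitted at the given availability."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

# "Five nines" (99.999%) allows only about 5.26 minutes of downtime a year,
# while "three nines" (99.9%) allows roughly 8.8 hours.
```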
Business Continuity Plan (BCP)
A plan for continuing an organization’s operations after an adverse event. Such an event could range from a computer virus to a full-scale disaster like a major earthquake.
The first step in developing a BCP is to perform a specialized risk assessment, called a business impact analysis (BIA). This consists of cataloging the systems, staff, and other resources that are necessary to a company’s operations. Then, a decision is made as to how soon each of these resources should be recovered to continue business.
The actual process is somewhat more involved than this, and there are many resources on the internet that explain it.
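As a toy illustration of the BIA cataloguing step described above, each critical resource can be paired with a recovery time objective (RTO) and sorted by urgency. The resource names and RTO values are invented for the example, not a real BIA:

```python
# Catalog of critical resources with a recovery time objective (RTO) in hours.
# Names and numbers are illustrative assumptions, not a real BIA.
RESOURCES = [
    ("email server",   24),
    ("order database",  4),
    ("payroll system", 72),
]

def recovery_order(resources: list[tuple[str, int]]) -> list[str]:
    """Resource names ordered most urgent (smallest RTO) first."""
    return [name for name, _rto in sorted(resources, key=lambda r: r[1])]
```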
Chief Privacy Officer (CPO)
Executive responsible for managing the privacy of information within an organization. This role has become increasingly important under requirements such as HIPAA and PCI DSS.
Depending on the size and mission of the organization, the CPO position may be combined with the CSO, or they may be separate.
Chief Security Officer (CSO)
Employee tasked with preserving the security of an organization’s information, often a separate position from the chief privacy officer. In some organizations the CSO is responsible for technical safeguards of sensitive information.
Confidentiality
Limiting access to information to authorized parties. One of the three core principles of information security.
When considering confidentiality, information may come in many forms. It may be “at rest”, as in a database or shared drive. It may be “in motion”, as when traveling over a network. It may be on portable media such as a CD, USB flash drive, or backup tape. Information on paper – printouts and faxes – may also require confidentiality protection. Different forms of information will use different techniques to preserve confidentiality.
Confidentiality of information is often a regulatory requirement.
An important method for preserving confidentiality is encryption.
Denial of Service (DoS)
A malicious attack on an information system rendering the system unavailable for its intended use. One common form of DoS attack is when an attacker deliberately overloads a web server to the point where legitimate users cannot access it. If you rely on web servers, a DoS attack can seriously affect your ability to do business.
When the attack traffic comes from a large number of compromised sources, it is called a distributed denial-of-service (DDoS) attack.
A threat of a DDoS attack has been used in extortion attempts.
See http://www.cert.org/tech_tips/denial_of_service.html for a description of the types of DoS attacks, and measures you can take to manage them.
Disaster Recovery Plan (DR, DRP)
Plan for returning an organization’s systems to a normal operating state after a disaster. The difference between a DRP and a BCP is not always clear, as definitions vary. While a DRP often focuses on IT systems, a BCP is more comprehensive and will help you deal with a wider range of events affecting more of your business.
Ultimately, it doesn’t matter what you call it; what really counts is that you’ve developed some kind of a plan to help your business continue through an adverse event and come out the other side.
Encryption
Transforming information so it cannot be read by anyone except an authorized party possessing what is usually called a key. Encryption is a powerful tool that is central to computer security.
Encryption is important to maintaining confidentiality of information*. A well-designed encryption algorithm will prevent anyone except the key-holder from reading a message.
Different forms of encryption are used for different purposes. For example, Secure Sockets Layer (SSL) is used to encrypt data during web transactions, and the Advanced Encryption Standard (AES) is often used to encrypt files that are being stored.
One measure of the strength of an encryption algorithm is its key length: other things being equal, the longer the key, the stronger the encryption. As computing power increases, progressively longer key lengths are necessary to remain secure.
The basic ideas of encryption are not hard, but the subject can rapidly become complex with symmetric, asymmetric, public keys, private keys, and hashing algorithms. For more info, start with Wikipedia: http://en.wikipedia.org/wiki/Encryption
*But not the only method. Other techniques such as access controls may also be used to preserve confidentiality.
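To illustrate the key-length point above: each additional bit doubles the number of possible keys a brute-force attacker must search. A sketch, using a hypothetical guess rate that is an assumption, not a measured figure:

```python
def keyspace(bits: int) -> int:
    """Number of possible keys for a key of the given length in bits."""
    return 2 ** bits

def average_bruteforce_years(bits: int, guesses_per_sec: float = 1e12) -> float:
    """Average time to find a key by exhaustive search, in years.
    The guess rate is a hypothetical assumption, not a measured figure."""
    seconds = keyspace(bits) / 2 / guesses_per_sec  # on average, half the space
    return seconds / (365 * 24 * 3600)

# Adding one bit doubles the work: keyspace(129) == 2 * keyspace(128)
```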
Firewall
A rule-based device or program intended to allow only specific network traffic to pass, and to block other traffic. A firewall, normally placed between an organization’s network and the public internet, is the first line of defense against unwanted traffic.
In its simplest form a firewall has two network connections, one of which will be connected to the internet, and the other to the internal network. In between, the firewall has a list of rules, set up by an admin, to filter the traffic.
One rule might say, for example, block web traffic from the internet. So, when web traffic arrives at the outside of the firewall, the firewall examines it, finds the rule that blocks it, and drops the traffic.
Another firewall rule might say, allow email traffic to come in, but only if it’s headed for the email computer. When email traffic arrives, the firewall examines it, and lets it in, but only if the destination is the email server.
See: http://streamer.checkpoint.com/extranet/pt/firewall_basics/ for an introductory lesson on firewalls.
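The rule examples above can be sketched as a first-match-wins filter. The rule list, port numbers, and Packet type are illustrative assumptions, not any real firewall’s configuration syntax:

```python
from dataclasses import dataclass

@dataclass
class Packet:
    src: str   # source address
    dst: str   # destination host
    port: int  # destination port

# Rules are checked top to bottom; the first matching rule decides.
# Each rule: (action, destination host or "*" for any, destination port)
RULES = [
    ("allow", "mail.example.com", 25),  # email, but only to the mail server
    ("block", "*", 80),                 # block inbound web traffic
]

DEFAULT_ACTION = "block"  # deny by default: unmatched traffic is dropped

def filter_packet(pkt: Packet) -> str:
    for action, host, port in RULES:
        if host in ("*", pkt.dst) and port == pkt.port:
            return action
    return DEFAULT_ACTION
```

Note the deny-by-default final line: anything not explicitly allowed is blocked, which mirrors how most real firewalls are configured.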
Information Security
Generally defined as protecting the confidentiality, integrity, and availability of data in any form. These three terms are sometimes called the C.I.A. triad, and are the central attributes of information security.
Integrity
One of the triad of core principles of information security. Integrity gives the user assurance that the information has not been altered or damaged. Another view of integrity is that the information cannot be changed without detection.
Some ways of verifying the integrity of information are the use of checksums, hash functions, or other technical algorithms.
A good source of definitions, including integrity, is the National Institute of Standards and Technology (NIST) Glossary of Key Information Security Terms.
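The hash-based integrity check mentioned above can be sketched with Python’s standard hashlib module; the chunk size is an arbitrary choice:

```python
import hashlib

def file_digest(path: str) -> str:
    """SHA-256 digest of a file, read in chunks so large files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Record the digest when the file is stored; recompute it later.
# If the two digests differ, the file has been altered or damaged.
```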
Intrusion Detection System (IDS)
A network device or program that monitors network traffic and logs and reports suspicious network activity.
An IDS is usually installed at the edge of an organization’s network, where incoming traffic is examined for malicious activity. Beyond inspecting the contents of the traffic, an IDS can also look for traffic with a specific signature pattern: a sequence of packets matching the profile of a known attack.
An intrusion prevention system (IPS) goes one step further, and attempts to block suspicious traffic once it has detected it.
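Signature matching, as described above, can be sketched as scanning payload bytes for known attack patterns. The signatures here are invented for illustration, not taken from a real IDS rule set:

```python
# Map of byte patterns to the attack each one indicates.
# These signatures are illustrative, not from a real IDS rule set.
SIGNATURES = {
    b"/etc/passwd": "path traversal attempt",
    b"' OR '1'='1": "SQL injection probe",
}

def inspect_payload(payload: bytes) -> list[str]:
    """Names of all known signatures found in the payload."""
    return [name for pattern, name in SIGNATURES.items() if pattern in payload]
```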
Least Privilege
The principle that an individual should have no more access to information than is necessary for them to do their job. This is another basic principle of information security.
For example, a clerk who enters transactions into an accounting system should not be able to approve or change transactions, as that is not part of their job. The system should be set up so that the clerk is prevented from changing transactions on their own, without a manager’s approval. The clerk has the least privilege necessary to do their job, and no more.
The administrator is a user who has the maximum privilege on a system, and can thus do anything they want. For this reason, the admin role is one that must be carefully managed.
Most operating systems, and databases, have provisions to set up roles, so that a given type of user is limited in what they can do.
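The clerk/manager example above amounts to role-based access control with deny-by-default. A minimal sketch; the role and action names are invented:

```python
# Each role is granted only the actions its job requires, and nothing more.
ROLE_PERMISSIONS = {
    "clerk":   {"enter_transaction"},
    "manager": {"enter_transaction", "approve_transaction"},
}

def is_allowed(role: str, action: str) -> bool:
    """Deny by default: unknown roles or unlisted actions are refused."""
    return action in ROLE_PERMISSIONS.get(role, set())
```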
Need to know
See least privilege
Patch / Software Patch
A piece of software intended to fix a problem, such as a bug in an application or an operating system. Security patches are often issued to correct a security weakness.
Vendors will often issue a patch when a new security vulnerability is identified. It is important that the patch is installed promptly to prevent the vulnerability from being exploited. On the other hand, patches have been known to cause problems in a system that is currently working properly. For this reason, it’s a good idea to test patches before deploying them to all systems.
PCI DSS standards
The Payment Card Industry Data Security Standard (PCI DSS) is a set of security standards intended to reduce credit card fraud by protecting cardholder information, in organizations that handle credit, debit, and other types of payment cards.
The standard is defined by the Payment Card Industry Security Standards Council. All entities that process, store or transmit cardholder data must comply with PCI DSS.
PCI DSS consists of 12 requirement areas grouped under six control objectives. The requirements, such as maintaining a firewall, restricting information access to a need-to-know basis, and encrypting data in transmission, are all based on fundamental information security principles.
Compliance with PCI DSS will greatly reduce the risk of cardholder data being compromised, and has the additional benefit of raising the awareness of information security across an organization.
The official source of the PCI DSS standard is here: https://www.pcisecuritystandards.org/security_standards/
Personally Identifiable Information (PII)
Information that can uniquely identify an individual. In organizations subject to HIPAA, a closely related category is protected health information (PHI): individually identifiable health information.
Some examples of PII are:
- Full name
- Social security number
- Photograph or fingerprints
- Street address
- Driver’s license number
- Credit card number
Many, if not all, forms of PII are required to be protected from disclosure by laws or regulations. Organizations holding PII must take strong measures to limit access to those with a need to know, and to prevent unauthorized parties from viewing it; a breach of PII may subject the organization to large fines and negative publicity.
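One way to reduce accidental disclosure is to scan outgoing text for strings shaped like common PII. A toy sketch follows; the two regular expressions are simplistic assumptions, nowhere near a production PII scanner:

```python
import re

# Simplistic patterns for two PII formats; real scanners are far more thorough.
PII_PATTERNS = {
    "social security number": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit card number":     re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def find_pii(text: str) -> list[str]:
    """Labels of any PII-shaped strings found in the text."""
    return [label for label, pat in PII_PATTERNS.items() if pat.search(text)]
```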
Policy
High-level statement of an organization’s goals in a specific area of computer security. All organizations holding information of value to themselves or others should have a set of information security policies to guide staff, and to demonstrate management’s commitment to information security.
Procedure
Detailed description of a method to implement a computer security policy. A procedure will contain step-by-step instructions on how to perform a process in a secure way in order to comply with the policy.
A good procedure should enable an employee to perform a process that they are not familiar with.
Risk Assessment
A review of the details of an organization’s information security posture intended to uncover security vulnerabilities and risks. A risk assessment may be a one-time event, but it may also be an ongoing process.
Risk assessments are normally performed as a gap analysis against an established security standard, for example ISO 27002, Code of practice for information security management.
The areas to be assessed are decided in advance. Gathering information in documents, by inspection or in interviews, the assessor compares the organization’s current security posture against the standard. The degree of compliance or noncompliance with each part of the standard is noted, and at the end a report is produced noting any gaps found.
The report may then be used as a basis for closing the gaps; a process called remediation.
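The gap-analysis comparison described above can be sketched as simple set arithmetic. The control names here are invented for illustration, not taken from ISO 27002:

```python
# Required controls from the chosen standard (names are illustrative).
REQUIRED_CONTROLS = {"firewall", "access_control_policy", "backup_procedure"}

def gap_report(implemented: set[str]) -> dict[str, list[str]]:
    """Compare implemented controls against the standard; list any gaps."""
    return {
        "compliant": sorted(REQUIRED_CONTROLS & implemented),
        "gaps":      sorted(REQUIRED_CONTROLS - implemented),
    }
```

The “gaps” list then becomes the input to remediation: each missing control is an item to close.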