Month: November 2016

Apple black Friday event 25/11/2016

Hello everyone. Apple is having a massive sale on Friday.

If you are interested in Apple products, now is the time.

If you are part of the cult of Apple, mark the date 25/11/2016 on your calendar.

Good luck to you all, and may the odds be ever in your favor.

system requirements document in story form example

The humble story of the TAFE Course Enquirer

 

A long time ago, there was a man named TAFE sitting on a bench. You could see the sadness in his eyes, a sadness left by a troubled past.

 

“I’m sorry, TAFE. We know what you’ve done for us, but this is the end. We are going public; we will look at other people as well.”

 

Voices of the past echoed in his head. After everything he had done, after everything he had given, in the end it didn’t matter.

 

“For almost two centuries, my family has given 500,000 enrolments and 1,000 courses, and this is what I get? This is not fair!”

 

He shouted into the air, hoping people would hear his agony, but no one was there to listen. He laughed it off… like a maniac, his eyes wandering in disbelief at the change. He knew he had to go private now that things had changed, but he was not confident he could keep up with the competitors around him, with their shiny promises to lure people in. He got up from the bench and wandered through the park, where he heard a sudden voice.

 

“Looks like you’re having a bad day, TAFE.”

 

He looked around and saw an old man sitting on a bench, shuffling tarot cards in his hands. He moved to the bench and sat next to the old man. He did not know why, but somehow this old man had an irresistible charm.

 

“Pick three cards,”

the old man said. Upon hearing this, TAFE picked three cards.

 

As he looked at the first card, which showed crossed swords:

“Hmm… looks like you are worried about the competition.”

As he looked at the second card, which showed an anchor:

“With your way of doing things, you don’t give people much chance to ask you questions, find out what you can offer, or keep in contact.”

As he looked at the third card, which showed a wall:

“It looks like your customers must get through a wall of obstacles to find what they need from you.”

 

TAFE was stunned. The cards were spot on about the worries he had been having.

“Pick another three cards.”

 

TAFE’s hands moved automatically towards the cards. As he tried to pick them, the old man grabbed his hands.

“Maybe I’ll just show you.”

 

A vision passed before TAFE’s eyes. He did not understand anything until he looked at the calendar: TAFE was in the future, and he seemed to have done well for himself. The way he was doing things now seemed very different from how he had done them before. Curious how, TAFE quickly jumped onto his computer and searched for his website. When he accessed it, he saw something amazing: the search engine had been replaced so that people could search by keywords, like Google. Having come this far, he grew more curious, so he searched for his favourite course. There he found a subscription button. He pressed it, and an email popped up on his phone confirming his subscription.

 

“Brilliant!”

he shouted. He then browsed some more and found an enquiry button.

 

“What’s this?”

TAFE asked himself. He clicked the button, and a text message box appeared.

 

“This is it?”

TAFE was surprised; the system had been stripped down to a bare minimum he hadn’t thought possible. He then submitted an enquiry and got a response in record time. He was marvelling at his own system when suddenly he heard a voice from the doorway.

 

“Enjoying yourself, TAFE?”

The old man was standing in the doorway, only this time he looked familiar, like an old friend.

 

“Some fortune teller you are… who are you?”

The old man answered:

“I am your future. No more will we be bound by EBS and its shackles of documents. No more will we be bound by long and boring processes. No more will we be scared of the competition…”

The old man stared into TAFE’s eyes and said,

 

“My name is TAFECourseEnquirer.”

At that moment, TAFE woke up on his bench. He realised he had never wandered through the park in the first place. It had all seemed like a dream.

 

“It doesn’t have to be a dream,”

he muttered to himself as he stood up and walked on, eyes full of determination.

Why should we use dummy data instead of live data?

Dummy data is benign information that does not contain any useful data, but serves to reserve space where real data is nominally present. It can be used as a placeholder for both testing and operational purposes.

Dummy data must be rigorously evaluated and documented to ensure that it does not cause unintended effects.

There are obvious pros and cons when using dummy data for testing.

Pros:

  • Easy to create a dummy set of data for testing as and when needed.
  • There is no need to obfuscate live data.
  • Testers can create the data they need without depending upon other teams.
  • A smaller data set can be created to test against where the testers know exactly what data exists (controlled sample).

Cons:

  • Dummy data cannot fully replicate every single type of data that exists in production, thus defects could be missed.
  • Using a smaller data set means that load test results may not reflect the size of production data (web page/web service response times).
  • Processing times on a smaller dataset will not accurately reflect what will happen in production (e.g. on an Oracle Financials database).

Most organizations are still using live data in test and development environments because of a lack of awareness around data security, and they don’t know they can easily mask or de-identify sensitive data using off-the-shelf technologies without changing applications or testing processes.

Even when the awareness is there, organizations still tend to rely on real data for its speed and ease of use.

Using live, cloned data is generally regarded as a shortcut when there isn’t enough time or resources to create test data, or a secure test data strategy isn’t in place.

But these are not excuses for a practice that can put customer data in great jeopardy. It is true that in general these test systems are not Internet-accessible, but even if you have absolute trust in all your employees — never a good starting point — that doesn’t remove the risk, as many organizations will outsource parts of development and hire contractors, consultants, and the like.  And if the media has taught us anything over the last decade about carelessness, it’s that people often store this type of data on laptops and removable media devices, and those assets can get lost or stolen.

Beyond the insider threat, there’s also the very real possibility that malicious external hackers can eventually work their way deep enough into the network after a blended attack and get their hands on test applications and live data.

The biggest change in recent years is the legislation requiring live data to be obfuscated on pre-live environments.  The challenge is to replicate live issues on non-live environments, and to test on live-like data prior to releasing code to production. Failure to do so can lead to defects being uncovered in production, just due to a deficiency in the actual data or the volume of the data used on a test environment.

It’s a challenge, but one that cannot be ignored. Either you use hand-crafted dummy data or obfuscated live data – either way, you cannot simply take live data and test with it unchanged!
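To make the trade-off concrete, here is a minimal Python sketch of the obfuscation approach. The record layout, field names and masking key are entirely hypothetical; the idea is to pseudonymize sensitive fields with a keyed hash so that a test copy stays internally consistent (joins still line up) without exposing live values. A real pipeline would use a dedicated masking tool, but the shape is the same.

```python
import hashlib
import hmac

# Hypothetical sketch: de-identifying a "live" record before it is copied
# into a test environment. Field names and the key are invented.
MASKING_KEY = b"test-env-masking-key"  # in practice, from a secrets store

def pseudonymize(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token.

    A keyed hash (HMAC-SHA256) maps the same input to the same token,
    so relationships between tables survive, while the original value
    cannot feasibly be recovered.
    """
    return hmac.new(MASKING_KEY, value.encode(), hashlib.sha256).hexdigest()[:12]

def mask_record(record: dict) -> dict:
    """Return a copy of the record that is safe to use as test data."""
    masked = dict(record)
    for field in ("name", "email", "tfn"):  # assumed sensitive fields
        if field in masked:
            masked[field] = pseudonymize(masked[field])
    return masked

live = {"id": 42, "name": "Jane Citizen", "email": "jane@example.com"}
test_copy = mask_record(live)
assert test_copy["id"] == 42              # non-sensitive data is unchanged
assert test_copy["name"] != live["name"]  # sensitive data is obfuscated
assert mask_record(live) == test_copy     # deterministic, so joins still work
```

The same idea scales down to hand-crafted dummy data: generate the controlled sample once, and keep anything derived from live records behind a transformation like this.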

 

Under what circumstances will you hold your TFN

Tax file numbers are unique numbers issued by the Australian Taxation Office (ATO) to identify individuals, corporations and others who lodge income tax returns with the ATO.

Once you get a tax file number (TFN), you need to keep it safe. Your TFN is how we identify you for tax and super. It’s yours for life, so don’t let anyone else use it – not even friends or family.

If someone else uses your TFN, it can cause serious problems, because they could commit fraud in your name and leave you to deal with the consequences.

Your TFN information must only be used or disclosed by TFN recipients for:

  • a purpose authorised by taxation law, personal assistance law or superannuation law
  • the purpose of giving you TFN information that they hold about you.

You will need your TFN when you:

  • lodge a tax return
  • apply for income assistance or support payments, such as pensions or benefits from DHS (which administers the Centrelink, Child Support and Medicare programs) or DVA
  • start a new job or change jobs
  • have savings accounts or investments that earn income (e.g. interest or dividends)
  • receive a payment under the Higher Education Loan Program
  • join a superannuation fund.

TFNs may not be used by a financial institution to confirm your identity. You must make sure you keep your TFN information in a safe place, and you should properly destroy any TFN information that you no longer need. This will help prevent other people from stealing your identity. You should report a lost or stolen TFN, or unauthorised access to your TFN information, to the ATO.

The Privacy (Tax File Number) Rule 2015 (TFN Rule) outlines how your TFN information should be collected, stored, used, disclosed and kept safe. All people, agencies, organisations and other entities that are allowed to ask for your TFN information must follow the TFN Rule.

The people, agencies, organisations and other entities that are allowed to ask for your TFN information must not record, collect, use or pass on your TFN unless this is permitted under taxation, personal assistance or superannuation law.

There is no law in Australia that says you must give an authorised person, agency, organisation or other entity your TFN if they ask for it. However, sometimes there may be financial consequences if you don’t give your TFN to someone who is allowed by law to ask you for it.

 Examples

  • If you are claiming or receiving a personal assistance payment from DHS (such as a pension, benefit or allowance) they may ask for your TFN to check your information with the ATO and other agencies that make payments.
  • If you do not give DHS your TFN, certain personal assistance payments may not be paid to you. Providing your TFN is a condition of receiving most Australian Government personal assistance payments.
  • If you don’t give your employer, bank, other financial institution or superannuation fund your TFN, it may affect how much tax you pay and could result in tax being deducted from your income or your interest payments at the highest marginal rate.
  • Your superannuation fund may ask for your TFN to facilitate the location and combination of your superannuation accounts. If you decide not to quote your TFN, the fund may not be able to find any additional accounts that you may have.

When an authorised person, agency, organisation or other entity asks you for your TFN, they must tell you:

  • why they are collecting it (including the name of the law or laws that allow them to collect your TFN and the purpose for which they are collecting it)
  • that it is not an offence if you do not give them your TFN
  • what will happen if you do not give them your TFN.

This information must be included in any forms that ask you for your TFN. The description of the purposes for collection can be reasonably general as long as it adequately informs you of what the law authorises the person, agency, organisation or other entity to do with your TFN.

If you consider that someone has not handled your TFN information properly, you can make a complaint to the OAIC. Before you do, however, you must first make your complaint to the person, agency, organisation or other entity you consider has mishandled your TFN information.

 

Why thinking about information security makes sense

To fully understand why information security is important, we need to understand both the value of information and the consequences of such information being compromised.

The Value of Information

To understand the value of information, let’s start by examining some typical information held by both businesses and individuals. At the very least, businesses will hold sensitive information on their employees, salary information, financial results, and business plans for the year ahead. They may also hold trade secrets, research and other information that gives them a competitive edge. Individuals usually hold sensitive personal information on their home computers and typically perform online functions such as banking, shopping and social networking; sharing their sensitive information with others over the internet.

As more and more of this information is stored and processed electronically and transmitted across company networks or the internet, the risk of unauthorised access increases and we are presented with growing challenges of how best to protect it.

Protecting Information

When you leave your house for work in the morning, you probably take steps to protect it and the contents from unauthorised access, damage and theft (e.g. turning off the lights, locking the doors and setting the alarm). This same principle can be applied to information – steps must be put in place to protect it. If left unprotected, information can be accessed by anyone. If information should fall into the wrong hands, it can wreck lives, bring down businesses and even be used to commit harm. Quite often, ensuring that information is appropriately protected is both a business and legal requirement. In addition, taking steps to protect your own personal information is a matter of privacy retention and will help prevent identity theft.

Information Breaches

When information is not adequately protected, it may be compromised and this is known as an information or security breach. The consequences of an information breach are severe. For businesses, a breach usually entails huge financial penalties, expensive law suits, loss of reputation and business. For individuals, a breach can lead to identity theft and damage to financial history or credit rating. Recovering from information breaches can take years and the costs are huge.

For example, a well-publicised information breach occurred at a popular clothing company in Australia during 2014-2015, when over 45 million credit/debit cards and nearly 500,000 records containing customer names and driver’s licence numbers were compromised. The information is believed to have been compromised due to inadequate protection of the company’s wireless networks, leaving it exposed. The final costs of the breach are expected to run into the hundreds of millions of dollars, and possibly over $1 billion.

So, that is why thinking of information security makes sense.

Sensitive data, cloud data and data privacy

Sensitive Data and Data privacy
Sensitive information is data that must be protected from unauthorized access to safeguard the privacy or security of an individual or organization.
There are three main types of sensitive information:
Personal information: Sensitive personally identifiable information (PII) is data that can be traced back to an individual and that, if disclosed, could result in harm to that person. Such information includes biometric data, medical information, personally identifiable financial information (PIFI) and unique identifiers such as passport or Social Security numbers. Threats include not only crimes such as identity theft but also disclosure of personal information that the individual would prefer remained private. Sensitive PII should be encrypted both in transit and at rest.
Business information: Sensitive business information includes anything that poses a risk to the company in question if discovered by a competitor or the general public. Such information includes trade secrets, acquisition plans, financial data and supplier and customer information, among other possibilities. With the ever-increasing amount of data generated by businesses, methods of protecting corporate information from unauthorized access are becoming integral to corporate security. These methods include metadata management and document sanitization.
Classified information: Classified information pertains to a government body and is restricted according to level of sensitivity (for example, restricted, confidential, secret and top secret). Information is generally classified to protect security. Once the risk of harm has passed or decreased, classified information may be declassified and, possibly, made public.
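As a small illustration of the document sanitization mentioned for business information, here is a hedged Python sketch. The metadata keys (`author`, `internal_path`) and the allow-list are invented for the example; the point is simply that only explicitly approved metadata survives before a document leaves the organisation.

```python
# Hypothetical sketch: stripping risky metadata from a document before it
# is shared externally. Keys and the allow-list are invented.
SAFE_METADATA = {"title", "created"}  # assumed allow-list of harmless fields

def sanitize(document: dict) -> dict:
    """Drop metadata (author, internal paths, etc.) that could leak
    sensitive business information; keep only allow-listed fields."""
    meta = {key: value
            for key, value in document.get("metadata", {}).items()
            if key in SAFE_METADATA}
    return {**document, "metadata": meta}

doc = {
    "body": "Q3 acquisition plan...",
    "metadata": {
        "title": "Plan",
        "author": "j.smith",                       # would leak a staff name
        "internal_path": "corp/strategy/q3.docx",  # would leak internal structure
        "created": "2016-11-01",
    },
}
clean = sanitize(doc)
assert "author" not in clean["metadata"]
assert clean["metadata"] == {"title": "Plan", "created": "2016-11-01"}
```

An allow-list rather than a block-list is the safer design here: any metadata field nobody thought about defaults to being removed.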
Data privacy, also called information privacy, is the aspect of information technology (IT) that deals with the ability an organization or individual has to determine what data in a computer system can be shared with third parties.
Although information technology is typically seen as the cause of privacy problems, there are also several ways in which it can help to solve them. There are rules, guidelines and best practices that can be used for designing privacy-preserving systems. Such possibilities range from ethically informed design methodologies to using encryption to protect personal information from unauthorized use.
There are also several industry guidelines that can be used to design privacy preserving IT systems. The Payment Card Industry Data Security Standard (see PCI DSS v3.0, 2013, in the Other Internet Resources), for example, gives very clear guidelines for privacy and security sensitive systems design in the domain of the credit card industry and its partners (retailers, banks). Various International Organization for Standardization (ISO) standards (Hone & Eloff 2002) also serve as a source of best practices and guidelines, especially with respect to security, for the design of privacy friendly systems.
Although data privacy and data security are often used as synonyms, they share more of a symbiotic type of relationship. Just as a home security system protects the privacy and integrity of a household, a data security policy is put in place to ensure data privacy. When a business is trusted with the personal and highly private information of its consumers, the business must enact an effective data security policy to protect this data. The following information offers specific details designed to create a more in depth understanding of data security and data privacy.
Data security is commonly referred to as the confidentiality, availability, and integrity of data. In other words, it is all of the practices and processes that are in place to ensure data isn’t being used or accessed by unauthorized individuals or parties. Data security ensures that the data is accurate and reliable and is available when those with authorized access need it. A data security plan includes facets such as collecting only the required information, keeping it safe, and destroying any information that is no longer needed. These steps will help any business meet the legal obligations of possessing sensitive data.
Data privacy is suitably defined as the appropriate use of data. When companies and merchants use data or information that is provided or entrusted to them, the data should be used according to the agreed purposes. The Federal Trade Commission enforces penalties against companies that have neglected to ensure the privacy of a customer’s data. In some cases, companies have sold, disclosed, or rented volumes of the consumer information entrusted to them to other parties without getting prior approval.

Cloud Data Privacy
Cloud computing is one of the most important current trends in the field of information and communications technology (ICT) and ICT management. The main service models of cloud computing are Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS). The term cloud computing derives from the cloud symbol usually used in diagrams to represent the internet and the complex infrastructure behind it.
Hardware and software are no longer procured and operated by users themselves but obtained as services. Cloud service providers enable users to access and use the necessary ICT resources via the internet. To provide these resources, providers often fall back upon other providers in the cloud, which, for example, make storage capacity available for customer data or computer capacity for data processing.
Cloud computing services are used by consumers as well as by organisations and companies. Offers in cloud computing include, among other things: the provision of computing and storage capacity; the provision and operation of development environments, operating systems and database-management systems; web hosting; webmail services; and a growing number of different types of application software, such as word processing and other office applications, customer relationship management, supply chain management, and the storage and management of photos or personal health-related data (electronic health records), to name a few.
There are several steps you can, and should, take to ensure the security of your corporate data when moving to the cloud.
The issue of data privacy is at the forefront of everybody’s mind. Television commercials advertise security products and news programs frequently describe the latest data breach. Public perception aside, any organization has a legal obligation to ensure that the privacy of their employees and clients is protected.
Laws prohibit some data from being used for secondary reasons other than the purpose for which it was originally collected. You can’t collect data on the health of your employees, for example, and then use it to charge smokers with higher insurance premiums. Also, you can’t share certain data with third parties. In the world of cloud computing, this becomes much more difficult, as you now have a third party operating and managing your infrastructure. By its very nature, that provider will have access to your data.
If you’re collecting and storing data in the cloud and it’s subject to the legal requirements of one or more regulations, you must ensure the cloud provider protects the privacy of the data in the appropriate manner. In the same way as data collected within your organization, data collected in the cloud must only be used for the purpose for which it was initially collected. If the individual specified that the data be used for one purpose, that assurance must be upheld.
Privacy notices often specify that individuals can access their data and have it deleted or modified. If the data is in a cloud provider’s environment, privacy requirements still apply and the enterprise must ensure this is allowed within a similar timeframe as if the data were stored on site. If data access can only be accomplished by personnel within the cloud provider’s enterprise, you must be satisfied they can fulfill the task as needed.
If you’ve entered into a click-wrap contract, you’ll be constrained to what the cloud provider has set out in these terms. Even with a tailored contract, the cloud provider may try to limit the data control to ensure its clients have a unified approach. This reduces the cloud provider’s overhead and the need to have specialized staff on hand. If complete control over your data is a necessity, you need to ensure this upfront and not bend to a cloud provider’s terms.
Cloud security architecture is effective only if the correct defensive implementations are in place. An efficient cloud security architecture should recognize the issues that will arise with security management.

Deterrent controls
These controls are intended to reduce attacks on a cloud system. Much like a warning sign on a fence or a property, deterrent controls typically reduce the threat level by informing potential attackers that there will be adverse consequences for them if they proceed. (Some consider them a subset of preventive controls.)

Preventive controls
Preventive controls strengthen the system against incidents, generally by reducing if not actually eliminating vulnerabilities. Strong authentication of cloud users, for instance, makes it less likely that unauthorized users can access cloud systems, and more likely that cloud users are positively identified.

Detective controls
Detective controls are intended to detect and react appropriately to any incidents that occur. In the event of an attack, a detective control will signal the preventive or corrective controls to address the issue. System and network security monitoring, including intrusion detection and prevention arrangements, are typically employed to detect attacks on cloud systems and the supporting communications infrastructure.

Corrective controls
Corrective controls reduce the consequences of an incident, normally by limiting the damage. They come into effect during or after an incident. Restoring system backups in order to rebuild a compromised system is an example of a corrective control.

Overengineering in IT projects

Software developers throw this word around: “overengineering.” “That code was overengineered.” “This is an overengineered solution.” Strangely enough, though, it’s hard to find an actual definition for the word. People are always giving examples of overengineered code, but rarely do they say what the word actually means.

The dictionary just defines it as a combination of “over” (meaning “too much”) and “engineer” (meaning “design and build”). So per the dictionary, it would mean that you designed or built too much.

Designed or built too much?  And isn’t design a good thing?

Well, yeah, most projects could use more design. They suffer from underengineering. But once in a while, somebody really gets into it and just designs too much. Basically, this is like when somebody builds an orbital laser to destroy an anthill. An orbital laser is really cool, but it (a) costs too much (b) takes too much time and (c) is a maintenance nightmare. I mean, somebody’s going to have to go up there and fix it when it breaks.

The tricky part is: how do you know when you’re overengineering? What’s the line between good design and too much design?

When your design actually makes things more complex instead of simplifying things, you’re overengineering.  An orbital laser would hugely complicate the life of somebody who just needed to destroy some anthills, whereas some simple ant poison would greatly simplify their life by removing their ant problem (if it worked).

This isn’t to say that all complexity is caused by overengineering. In fact, most complexity is caused by underengineering. If you have to choose, the safer side to be on is overengineering. But that’s kind of like saying it’s safer to be facing away from an atom bomb blast than toward it. It’s true (because it protects your eyes more), but really, either way, it’s going to suck pretty bad.

The best way to avoid overengineering is simply not to design too far into the future. Overengineering tends to happen like this: somebody imagined a requirement without knowing whether it would actually be needed. They designed too far into the future, without actually knowing the future.

Now, if that developer really did know he’d need such a system in the future, it would be a mistake to design the code in such a way that the system couldn’t be added later. It doesn’t need to be there now, but you’d be underengineering if you made it impossible to add it later.

With overengineering, the big problem is that it makes it difficult for people to understand your code. There’s some piece built into the system that doesn’t really need to be there, and the person reading the code can’t figure out why it’s there, or even how the whole system works (since it’s now so complicated). It also has all the other flaws that designing too far into the future has, such as locking you into a particular design before you can actually be certain it’s the right one.

There are lots of common ways to overengineer. Probably the most common are making something extensible that won’t ever need to be extended, and making something way more generic than it needs to be.

A good example of the first (making something extensible that doesn’t need to be) would be making a web server that could support an unlimited number of other protocols in addition to HTTP. That’s kind of silly, because if you’re a web server, then you’re sending HTTP. You’re not an “every possible protocol” server.

However, in that same situation, underengineering would be not allowing for any future extension of the HTTP standard. That’s something that does need to be extensible, because it really might change.

The issue here is “How likely it is that this thing’s going to change?” If you can be 99.99% certain that some part of your system is never going to change (for example, the letters available in the English language probably won’t be changing much–that’s a fairly good certainty) you don’t need to make that part of the system very extensible.

There are just some things you have to assume won’t change (like, “We will never be serving any other protocol than HTTP”)–otherwise your system just gets too complex, trying to take into account every possible unknown future change, when there probably won’t even be any future differences. This is the exception, rather than the rule (you should assume most things will change), but you have to have a few stable, unchanging things to build your system around.

In addition to being too generic on the whole-program level, individual components of the program can also be too generic. A function that processes strings doesn’t also have to process integers and arrays, if you’re never going to be getting arrays and integers as input.
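That contrast can be sketched in a few lines of Python (the names here are invented for illustration): the over-generic version dispatches on input types the program will never receive, while the simple version does exactly the one job that is needed.

```python
# Overengineered: a "processor" generalised to handle any input type,
# even though the program only ever receives strings.
class GenericProcessor:
    def __init__(self):
        self.handlers = {
            str: self._process_str,
            int: self._process_int,    # never needed
            list: self._process_list,  # never needed
        }

    def _process_str(self, s):
        return s.strip().lower()

    def _process_int(self, n):
        return abs(n)

    def _process_list(self, items):
        return [self.process(item) for item in items]

    def process(self, value):
        # Dispatch on the runtime type of the input.
        return self.handlers[type(value)](value)

# Sufficient: the program only gets strings, so a plain function will do.
def normalize(text: str) -> str:
    return text.strip().lower()

assert GenericProcessor().process("  Hello ") == normalize("  Hello ") == "hello"
```

Both produce the same result for the only input that actually occurs; the first just costs more to read, test and maintain.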

You don’t have to overengineer in a huge way, either, to mess up your system. Little by little, tiny bits of overengineering can stack up into one huge complex mass.

Good design is design that leads to simplicity in implementation and maintenance, and makes it easy to understand the code. Overengineered design is design that leads to difficulty in implementation, makes maintenance a nightmare, and turns otherwise simple code into a twisty maze of complexity. It’s not nearly as common as underengineering, but it’s still important to watch out for.

Australian Census 2016 Who Is Liable?

It was a dreary day in August. You were trying to finish work early to get home before the Census night deadline, or risk paying a fine of $180 or more. You rushed home and sat down in front of the computer, ready to do the survey, only to find that you were unable to access the Census website. You kept refreshing your browser, hoping for the content to load, but the error message kept appearing instead, like a broken record. Something was obviously wrong, and later that night you found out from social media that the Australian Bureau of Statistics (ABS) website had been experiencing outages, causing you and the rest of Australia grief. You were less than impressed and immediately blamed the ABS for wasting your valuable time. But were they the main culprit for this chaos?

The Australian Bureau of Statistics, as its name suggests, is a government agency responsible for gathering data from the public to understand Australia’s inhabitants and their needs. Collected data are analysed and referenced by the government and the community, enabling better planning on social, economic, environmental and population matters. The ABS enlisted the tech giant IBM, on a staggering $9.6 million contract, to host and manage traffic for the 2016 online Census. Various tests were done before the Census kicked off; however, come Census night, the website crashed continually. It was a frustrating incident even for then Prime Minister Malcolm Turnbull, who expressed his disappointment and specifically called out both IBM and the ABS. Of course, if you are the client (ABS), your natural instinct is to point the finger at the person you are paying to do the job (the service provider, IBM).

So what went wrong? The ABS’s initial reaction was to suggest a DDoS (Distributed Denial of Service) attack; in plain English, that their systems had been deliberately flooded with malicious traffic. Further investigation, however, suggested their servers simply could not cope with the surge of legitimate users trying to access the website at the same time. IBM was dragged through the mud, its integrity tainted, and heavy criticism came from left, right and centre over the Census’s poor design and architecture. In essence, IBM’s lack of due diligence cost it dearly, and perhaps future multi-million dollar government deals in the pipeline. It makes me wonder how a veteran technology company that has withstood the test of time could mess up so substantially, and with, of all clients, the most bureaucratic one: the government!

Ensuring data security with cloud encryption

 

Cryptography has been with us since the dawn of human civilization. People have wanted to keep sensitive information from prying eyes long before the invention of the complex, computer-based encryption methods that we utilize today. The ancient Greeks protected their secret messages by tattooing them on the shaved head of a messenger. The messenger’s hair would grow back during the journey, rendering the message invisible; the recipient just needed to know a good barber in order to read the secret message on arrival.

So what does this have to do with companies putting sensitive data in the cloud? Just like the ancient Greeks, we are trying to keep our secrets safe from prying eyes. The methods have changed, but the goal remains the same. One of the best ways to ensure confidential data is protected in the cloud is to encrypt it both in transit and at rest. There are still potential issues with encryption that need to be considered when evaluating cloud services. Almost all cloud service providers support encryption for data in transit, but fewer offer support for data at rest. The encryption capabilities of the service provider need to match the sensitivity of the data being hosted.
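As a concrete illustration of the “in transit” half, here is a minimal sketch using Python’s standard-library `ssl` module. It shows the client-side settings that make transport encryption meaningful: certificate verification, hostname checking, and refusing legacy protocol versions. This is a generic example, not drawn from any particular cloud provider’s API.

```python
import ssl

# Encryption in transit: a client-side TLS context built with the
# standard library. create_default_context() turns on certificate
# verification and hostname checking by default.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy SSL/TLS versions

print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: peer must present a valid cert
print(ctx.check_hostname)                    # True: cert must match the hostname
```

A context like this would then be passed to an HTTPS or socket connection; the point is that encryption in transit is only as good as the verification settings around it.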

 

Cloud encryption options

The basic business model of the typical cloud services provider is built on scalability: the more customers that can share the same resources, the better the provider’s profit margin. The idea works in reverse as well: the more customers sharing resources, the lower the cost paid by each customer. These facts play a critical role in a provider’s decision to offer encryption services. Encryption adds processor overhead, so it lowers the number of customers per resource and increases overall costs. For this reason, most cloud providers offer basic encryption only on a few database fields, such as passwords and account numbers. Options to encrypt the entire database are usually available, but they can dramatically increase cost, to the point where cloud hosting becomes more expensive than internal hosting.
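The economics here can be made concrete with a toy back-of-envelope model. All figures below are hypothetical; the point is only that encryption overhead reduces how many tenants one server can support, which raises the per-customer unit cost.

```python
# Toy per-customer cost model (all figures hypothetical) showing why
# shared resources lower unit cost and why encryption overhead raises it.
def cost_per_customer(server_cost, customers_per_server):
    return server_cost / customers_per_server

monthly_server_cost = 1000.0  # hypothetical cost of one server per month

plain = cost_per_customer(monthly_server_cost, 50)      # 50 tenants fit without encryption
encrypted = cost_per_customer(monthly_server_cost, 20)  # CPU overhead -> only 20 tenants fit

print(plain)      # 20.0 per customer per month
print(encrypted)  # 50.0 per customer per month
```

Under these made-up numbers, full encryption more than doubles the provider’s unit cost, which is exactly the margin pressure the paragraph above describes.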

Some cloud providers offer alternatives to encryption that avoid the performance impact, such as redacting or obfuscating confidential data. This can sound appealing, but it is just another form of “security through obscurity”: neither technique is effective at securing confidential data, because both are easily bypassed.

Another alternative a service provider may offer to reduce the encryption performance penalty is its own custom encryption scheme. This is a major red flag for potential customers. The current encryption standards have been thoroughly tested and verified over many years by many brilliant engineers and cryptographers. A cloud service provider is unlikely to fund that level of development for a proprietary scheme, and the scheme will not receive the same public scrutiny and feedback as the accepted standards. This creates a strong possibility of a cryptographic mistake that could leave customer data vulnerable to exposure. Proprietary encryption schemes should be avoided at all costs.

Even a cloud provider that offers a standards-based encryption solution may present other risks that need to be considered. Encrypted data is only as secure as the private key used to encrypt it. Key management becomes a critical issue, and the cloud provider must have policies and procedures in place for the generation, storage and archival of private keys. Keep in mind that anyone who possesses the private key has access to your confidential data.
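One small, standards-based piece of key management can be sketched with the standard library: deriving an encryption key from a passphrase using PBKDF2, a NIST-recommended key derivation function. The passphrase, salt size and iteration count below are illustrative; a real key-management policy would set, store and rotate these deliberately.

```python
import hashlib
import hmac
import os

# Derive a 256-bit key from a passphrase with PBKDF2-HMAC-SHA256.
passphrase = b"correct horse battery staple"  # hypothetical secret
salt = os.urandom(16)                          # random per-key salt
key = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)

# The same passphrase + salt always derives the same key; lose the salt
# (or the passphrase) and the encrypted data is unrecoverable.
key_again = hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)
print(hmac.compare_digest(key, key_again))  # True
```

The derived key is what actually protects the data, which is why the article’s point stands: whoever holds (or can re-derive) that key holds the data.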

Additional cloud encryption considerations

There are other operational encryption issues to consider when using a cloud service provider, including the policies and procedures for encrypting tape backups and other removable media, such as DVD-R and USB devices. Your data may be safely encrypted in the provider’s database, but if the provider uses unencrypted media in its operations you may still be at risk of exposure; it’s important to understand these operational risks before putting your data in the provider’s care.

Finally, there are areas where technology does not yet permit encryption. Actual processing of the data by the cloud provider requires that the data be decrypted at some point. This may change with the advent of homomorphic encryption, demonstrated by IBM in 2009, which allows data to be processed while still encrypted. It remains an emerging technology, but it would certainly increase the security capabilities of cloud providers.
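The idea behind homomorphic encryption can be shown with a classic toy: unpadded “textbook” RSA is multiplicatively homomorphic, meaning the product of two ciphertexts decrypts to the product of the plaintexts. The tiny key below is purely illustrative and offers no real security; real homomorphic schemes (like IBM’s) are far more sophisticated.

```python
# Toy demonstration of a homomorphic property: with textbook RSA,
# Enc(a) * Enc(b) mod n decrypts to a * b, so a server could multiply
# values without ever seeing them. Deliberately tiny, insecure key.
n, e = 3233, 17  # toy public key (p=61, q=53)
d = 2753         # toy private key

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

a, b = 7, 6
product_of_ciphertexts = (enc(a) * enc(b)) % n
print(dec(product_of_ciphertexts))  # 42, i.e. a * b, computed on ciphertexts
```

A fully homomorphic scheme supports both addition and multiplication on ciphertexts, which is what makes general encrypted computation possible; this sketch only shows the single-operation flavour of the idea.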

 

Cloud encryption and compliance

So the million-dollar question becomes: “Should regulated data be put into the cloud?”  It is certainly possible to maintain regulatory compliance while using cloud services.  Encryption plays a big role in compliance, as many regulations require specific data elements to be encrypted.  This type of requirement is present in GLBA, PCI DSS and HIPAA, to name a few.  The most important guidance on encryption is publicly available in NIST SP 800-111 and FIPS 140-2.  These standards can help you evaluate a cloud provider’s encryption capabilities for regulatory compliance.

Encryption is a powerful tool that can effectively protect a company’s confidential data in the cloud.  It is important for a company to investigate and understand how a cloud provider uses encryption in its operational procedures.  Only then can the company confidently use cloud providers, knowing that its confidential data is protected.  Modern encryption algorithms far surpass the protections available to the ancient Greeks for their sensitive data, and no one will need their head shaved.

Cloud Concepts and the Impact on Business Analysts

Cloud computing is generating significant interest and momentum. The cloud ecosystem requires new considerations for the business analysis community to take full advantage of cloud computing opportunities.

This article provides links to key NIST reference architectures that offer a stable cloud foundation, and identifies some of the key considerations for business analysts in a “cloudy” world.

 

Cloud Basics
Almost all cloud computing approaches use the definitions of the National Institute of Standards and Technology (NIST), an agency under the U.S. Department of Commerce, as a core foundation from which solutions are defined.
The accepted NIST definition for cloud computing is: 

 

“Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model promotes availability and is composed of five essential characteristics, three service models, and four deployment models.”
Figure 1.0 represents a simplified model, containing the characteristics, service models, deployment models and very important hosting options that also need careful consideration for business analysts.

 

Figure 1.0 NIST Cloud Basic Core Terminology
Hosting considerations are separate from the deployment models. For example, you can have a private deployment that is hosted internally, within your organization, or hosted externally by a third party. This is a critical consideration when an organization desires a private cloud but does not want the physical infrastructure within its facility.
The NIST Cloud Computing Reference Architecture is a “role-based” perspective. There are five basic roles: cloud consumer, cloud provider, cloud auditor, cloud broker and cloud carrier.
Business analysts can provide valuable insights to clarify opportunities and identify associated risks that require consideration to optimize the solution.
Most frequently, the business analysts will clarify the business requirements for the cloud consumer and help identify the “best fit” cloud provider. The business analysts may perform “functional fit” analysis to determine the configuration, customization and acceptance testing efforts for the selected solution. Business analysts may also participate in value, pricing and costing analysis.
That brings us to a common concern that impacts cloud adoption: “security and privacy”.
Security and Privacy Are Everyone’s Concern
In 2013, the updated NIST Reference Architecture evolved to clarify that security and privacy are of concern not just to the cloud provider, but to all roles in the cloud ecosystem.

Figure 2.0 therefore represents a draft model that better communicates that privacy and security are “cross-cutting” considerations.
Figure 2.0 Draft Updated NIST Cloud Computing Reference Architecture for 2013
NIST is also generating a Cloud Computing Security Reference Architecture that is due for release April 2014.
Other NIST Cloud Computing publications available or being worked on include: Cloud Roadmaps, Security & Privacy, Service Level Agreements, Cloud Carrier, Metrics and Standards. Many documents are available and free for download from www.nist.gov.
Pricing and Costing
Cloud computing solutions can yield tremendous cost savings when planned and implemented properly. When planning is poor or pricing is blurred (obscured by the clouds, as it were), the expected benefits may not be realized, and costs can end up higher than with traditional solutions.
Pricing and costing of cloud computing solutions is often complex with various pricing options, service levels and terms and conditions that directly impact core decisions moving forward.
Business analysis can provide the context and considerations needed to estimate Total Cost of Ownership (TCO), a core measure for focusing value and cost discussions.
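A minimal TCO sketch makes the comparison concrete. All inputs below are hypothetical; the point is that cloud subscriptions trade up-front capital expenditure for recurring fees, so any fair comparison must cover the same multi-year horizon.

```python
# Minimal Total Cost of Ownership comparison (all inputs hypothetical).
def tco(upfront, monthly, months):
    return upfront + monthly * months

horizon = 36  # compare both options over the same 3-year horizon

on_premise = tco(upfront=120_000, monthly=2_000, months=horizon)  # big capex, low opex
cloud = tco(upfront=5_000, monthly=4_500, months=horizon)         # low capex, high opex

print(on_premise)  # 192000
print(cloud)       # 167000
```

Note how the answer flips depending on the horizon and the monthly rate: with different hypothetical inputs, the on-premise option could easily come out cheaper, which is exactly why analysts must pin down pricing terms before a decision is made.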

Of course, many other benefits exist from cloud, but price is often at the forefront.

Summary

Value-focused business analysis will remain a core and necessary activity for balancing business needs against technical solutions. The role, however, will evolve to analyze cloud computing solutions in conjunction with business objectives, opportunities and risks.
A Guide to the Business Analysis Body of Knowledge® (BABOK® Guide) already contains the basics for cloud computing analysis. Extensions and clarifications will be developed as the cloud ecosystem becomes more accepted and commonplace. Many organizations, such as NIST, are clarifying cloud models, opportunities, risks and plans.
Cloud computing’s market share will continue to increase, so business analysts will need to be comfortable with the cloud ecosystem and able to recognize its specific opportunities and challenges.