OSS Blogs & News
Identity and Access Management (IAM) refers to the set of business processes and supporting technologies that enable the creation, maintenance, and use of a digital identity – in other words, it is about your users: who they are, how they are authenticated, and what they can access.
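The three facets named above – identity, authentication, and access – can be sketched in a few lines of Python. This is only an illustration of the concepts; the class, function names, and credentials below are hypothetical, not any particular IAM product's API.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    username: str                             # identity: who they are
    password_hash: str                        # credential used to authenticate them
    roles: set = field(default_factory=set)   # entitlements: what they can access

def authenticate(user: User, presented_hash: str) -> bool:
    """Authentication: verify the user is who they claim to be."""
    return user.password_hash == presented_hash

def can_access(user: User, required_role: str) -> bool:
    """Authorisation: decide what an authenticated user may do."""
    return required_role in user.roles

# Hypothetical user with access to one resource only.
alice = User("alice", "hash123", {"finance-reports"})
```

In practice the three concerns are handled by separate systems (a directory for identity, an authentication provider, and per-application authorisation), which is exactly why a coherent IAM programme is needed to tie them together.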
November 2015 – Open Systems Specialists Limited (OSS), a leading provider of ICT infrastructure and services, is celebrating the anniversary of its founding, marking twenty successful years of exceeding customer expectations by delivering exceptional services across most major industries in New Zealand.
I have been doing a lot of reading over the last couple of months as part of my role as the OSS Practice Manager. As you can see on the OSS website, we have four practices that we as a company specialise in. The reason for all the reading is twofold: first, I like to know what our technical consultants are working with; and second, I need to keep abreast of trends and new products in the industry so we can provide the best service to our customers.
The information security landscape has changed dramatically over the last 10 years. It used to be the case that defending against outside attacks from the Internet with a rudimentary firewall was sufficient in almost all cases, and that intrusion detection was a straightforward matter of monitoring log data for anything out of the ordinary.
I interact with clients (and potential clients) on a daily basis. It struck me a while ago that many of the issues they are struggling to cope with are ones they have brought upon themselves.
In February 2013 the Center for Strategic and International Studies (CSIS) released a paper titled Raising the Bar for Cybersecurity. The paper states that more than 90% of successful security breaches required only the most basic techniques; 85% of breaches took months to be discovered; and 96% of successful breaches could have been avoided if the victims had put simple or intermediate controls in place.
We are all basically familiar with Von Neumann machines – instructions operate on data, with the program stored independently of the data it manipulates. This is the basic architecture of everything from a phone to a mainframe. We know we can use a single installation of a program to manipulate a multitude of corresponding datasets; for example, an installation of Word can be used to edit any Word document.
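The "one program, many datasets" idea can be sketched in a few lines of Python. The function and documents below are illustrative stand-ins: the single function plays the role of the installed program, and each string plays the role of a separate document it can edit.

```python
def edit_document(text: str) -> str:
    """A stand-in 'program': capitalise the first character of every line."""
    return "\n".join(line[:1].upper() + line[1:] for line in text.splitlines())

# Three independent 'documents' – the same program edits any of them.
documents = [
    "hello world",
    "separate data,\nsame program",
    "a third document",
]
edited = [edit_document(doc) for doc in documents]
```

The program exists once; the datasets exist in any number, anywhere – which is precisely the separation we tend to stop noticing.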
We have become so familiar with the Von Neumann concept that we start to forget a key point:
Rapid data growth, increased infrastructure complexity, demanding Service Level Agreements (SLAs), strict legal compliance, IT virtualisation and virtual machine (VM) sprawl, and the advent of cloud-based computing bring huge challenges to how data must now be managed and protected in the modern business.
Having confidence that a system or application Recovery Time Objective (RTO) can be met is vital. It is critical to understand and consider everything that needs to be operational in order to recover in time to avoid business impact. This is always more than just asking “is it backed up?”.
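The point that recovery is more than a restore can be made with simple arithmetic: an RTO is met only if every step in the recovery chain fits inside it. The step names and durations below are purely illustrative assumptions, not measurements from any real environment.

```python
# Illustrative 4-hour Recovery Time Objective, in minutes.
RTO_MINUTES = 240

# Hypothetical recovery chain: restoring the backup is only one step.
recovery_steps = {
    "provision replacement server": 60,
    "reinstall OS and application": 45,
    "restore database from backup": 90,
    "replay transaction logs":      30,
    "validate and redirect users":  30,
}

total_minutes = sum(recovery_steps.values())
meets_rto = total_minutes <= RTO_MINUTES
```

Even with a backup that restores comfortably within the objective, the chain as a whole overshoots – which is why the RTO conversation has to cover everything that must be operational, not just the backup.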
CXOs have long understood the transformational value of getting decision support information into the hands of the business. The challenge is that Business Intelligence/Enterprise Data Warehouse solutions have historically been costly and time-consuming to implement, with a high level of uncertainty around the value of the data produced. Traditional solutions use an import-process-report model that does not provide the data when it is most valuable – in real time. And the lead time required to introduce new data sources and reporting can be of the order of days or even months, due to the need for staff with specialist skills to operate the system.