Rob Attoe is Nuix’s Global Head of Training and will be presenting Nuix Core Training on 5 December as part of the HTCIA Asia Pacific Training Conference in Hong Kong.
- What are the major challenges facing digital investigators?
Digital investigations have changed significantly over the years. In the early 2000s, for example, we would examine every file and every piece of data. Today we’re dealing with terabytes, even petabytes, of data, and that means the way we process these exhibits has to change. So our biggest challenges are finding where data resides in cloud storage; coping with massive volumes of data; and knowing where to start the examination.
- How can the digital investigations community cope with the growing scale of data involved in investigations?
If we go back to the infancy of digital forensics in the mid-to-late 1990s, we looked at every single bit and byte on the drive. But the days when we could simply walk through a file system or folder structure to find our smoking-gun evidence are long gone.
Now we need to use smarter tools that allow us to cull those items. We need a methodology, a workflow, that removes the need to look at millions of files. That means first looking at the files to see what story a particular evidence item is telling, and then using smarter tools to focus on the items that matter.
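One common culling technique is known-file filtering: hashing every file on an exhibit and discarding anything that matches a reference set of known operating-system and application files (such as the NIST NSRL). A minimal Python sketch of the idea, using a hypothetical in-memory hash set rather than a real reference database:

```python
import hashlib
from pathlib import Path

# Hypothetical known-good hash set; in practice this would be loaded
# from an NSRL-style reference database, not hard-coded.
KNOWN_GOOD = {
    "d41d8cd98f00b204e9800998ecf8427e",  # MD5 of an empty file
}

def md5_of(path: Path) -> str:
    """Hash a file in chunks so large evidence files fit in memory."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def cull(evidence_root: str) -> list[Path]:
    """Return only the files whose hashes are NOT in the known-good set."""
    return [
        p for p in Path(evidence_root).rglob("*")
        if p.is_file() and md5_of(p) not in KNOWN_GOOD
    ]
```

Even this naive version captures the workflow shift: the examiner starts from the handful of files the filter leaves behind, not from every file on the drive.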
At the basic level there are techniques such as keyword searches and timeline analysis that help us understand the story the items are telling. Then we have more advanced technology that allows us to focus on just the items that are relevant. For example, within Nuix we have topic modelling and text summarisation: I can take a whole series of files or emails and the computer automatically identifies what those documents are about, so I can very quickly determine their relevance to an examination.
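The interview doesn’t describe Nuix’s internals, but the general idea behind extractive text summarisation can be sketched in a few lines of standard-library Python: score each sentence by how often its content words appear across the whole document, then keep the top scorers:

```python
import re
from collections import Counter

# A tiny illustrative stopword list; real systems use much larger ones.
STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "it", "that", "we"}

def summarise(text: str, n_sentences: int = 2) -> list[str]:
    """Naive extractive summary: rank sentences by content-word frequency."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    freq = Counter(words)
    scored = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"[a-z']+", s.lower())),
        reverse=True,
    )
    # Keep the top-scoring sentences, presented in their original order.
    top = set(scored[:n_sentences])
    return [s for s in sentences if s in top]
```

This is a sketch of the technique in general, not of any commercial implementation; it shows why such tools let an examiner gauge a document’s subject without reading it in full.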
- How does data in the cloud affect digital investigations, practically and legally?
Data in the cloud is a real problem for investigators because we don’t know where the data is stored. It may be replicated across many different servers globally. So we can’t go to a cloud provider and ask to image the physical drive because, typically, multiple customers’ data are co-mingled on the same hard drive. Instead, we have to work with the data stored or cached on the user’s local hard drive.
It comes down to the examiner’s experience: understanding how the data got there and finding artefacts that show a particular item has been opened, viewed or deleted. It is not only a matter of showing that the files exist, but of demonstrating culpability: that the individual was aware of the file and opened it.
- Many organisations are boosting their security measures in response to high-profile data breaches. Does increased security make it harder or easier for investigators to do their job?
Improved security can make it easier. In theory, if an organisation has secured its data effectively, we only need to look in one place. We simply ask the security analyst, “Where does your policy say you should store this data?” Then, start looking in that place for information spillage, evidence of a breach, or an attack vector on the system. But the reality is, organisations often don’t know where their data is, and there are copies of sensitive information in places they shouldn’t be. So first we have to work out where the data leaked from and then go through the rest of the process.
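The “looking for information spillage” step often begins as a simple pattern sweep: scanning files outside the approved store for content that matches sensitive-data formats. A rough sketch, assuming hypothetical regex patterns (production tools use validated detectors, for example Luhn checks on card numbers, not bare regexes):

```python
import re
from pathlib import Path

# Hypothetical spillage patterns for illustration only.
PATTERNS = {
    "ssn-like": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card-like": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def find_spillage(root: str) -> dict[str, list[str]]:
    """Map each pattern name to the files under root that match it."""
    hits: dict[str, list[str]] = {name: [] for name in PATTERNS}
    for p in Path(root).rglob("*.txt"):
        text = p.read_text(errors="ignore")
        for name, pat in PATTERNS.items():
            if pat.search(text):
                hits[name].append(p.name)
    return hits
```

A sweep like this only locates candidate spillage; confirming where the data leaked from still requires the manual follow-up the interview describes.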
- What other trends will impact digital investigators in 2015?
In digital forensics, we’ll see Microsoft releasing a new operating system, and that means a new set of policies and security settings to understand and deal with. More generally we’ll see a lot more operating systems using disk encryption by default. Digital examiners are now going to face more encrypted disks, so they’ll need to take that into consideration with their search and seizure techniques. If a machine is up and running when they seize it, imaging that device in place may become standard operating procedure.
- Where do you see investigative technology evolving in 2015?
We’ve started seeing a lot more push-button forensics: taking an image file, or a collection of them, letting the tools do the work, and presenting the results back to examiners. A lot of examinations can now be handled simply by running scripts and getting the tool to present the information.
Our industry is going through a complete rethink of how we do our forensic examinations. Forensic examiners have always thought we had to look at every bit, every byte. Changing that mindset across the industry is a bigger challenge than teaching people to use a new piece of software.
Whenever I teach forensics classes, I ask, “How many times have you been challenged in court as to how the data got there?” Out of 100 examiners, maybe one or two have ever been challenged, even once in their careers, about how the data actually got onto the drive. So this obsession with deep forensics is pointless. Worse, it’s holding us back.
By contrast, new examiners coming out of college already understand big data. They understand different workflows and they don’t want to go down to the hex level. So, one of the biggest challenges is changing the mindsets of the existing examiners. We are getting there simply because people do not have the time to look at every bit and every byte anymore.