When I work with clients and we discuss CIS Critical Control 2, their focus is often on inventorying their installed software.  Today we'll talk about inventorying software that you didn't install.  Malware is typically the primary target in that “we didn’t install that software” list.

The method we're looking at today will inventory the running processes across the enterprise, and we'll look at how to "sift" that information to find outliers - applications that are running on only one or two hosts (or on 5 or 10% of hosts, whatever your cutoff is).  Note that this is hunting for *running* software, not software that was installed with a traditional MSI file, so it does a good job of finding malware, especially malware that hasn't spread far past its initial infection hosts yet.

OK, let's look at the base code.  We're basically running get-process, getting the path on disk for each process, then hashing that file on disk.  If the hash operation errors out (which it will for fileless malware, for instance), that file is saved to an error log.  The hash is the key item: it uniquely identifies each file, so even if malware has replaced a known filename, the hash will be different on that station.  You can then use this hash to reference back to malware IOCs if that's helpful.  Note that the hash in this case is SHA1 - you can change this to meet whatever your hashing requirements are, or add a few different hashing algorithms if that works better for you.
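If you do want more than one algorithm, Get-FileHash can simply be called once per algorithm.  A quick sketch (the choice of SHA1 plus SHA256, and hashing the current shell's own executable, are just illustrative):

```powershell
# hash one file with two algorithms - useful when different
# IOC feeds publish different hash types
$file = (Get-Process -Id $PID).Path
foreach ($algo in 'SHA1','SHA256') {
    Get-FileHash $file -Algorithm $algo | Select-Object Algorithm, Hash, Path
}
```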

# collect the process list, then loop through the list
foreach ($proc in Get-Process)
    {
    try
        {
        # hash the executable file on disk
        $hash = Get-FileHash $proc.Path -Algorithm SHA1 -ErrorAction Stop
        }
    catch
        {
        # error handling.  If the file can't be hashed, either it's not on disk or we don't have rights to it
        $proc.Name, $proc.Path | Out-File c:\temp\proc_hash_error.log -Append
        }
    }

We'll then run our script across the entire organization, and save both the process data and the errors in one set of files.  Because we're hashing the files, it's likely better (and certainly much faster) to run this operation on the remote systems rather than opening all the files over the network to hash them.

Note that when we do this we’ll be logging the error information out to a remote share.

function RemoteTaskList {
# collect the process list, then loop through the list

$wsproclist = @()
foreach ($proc in Get-Process)
    {
    try
        {
        # hash the executable file on disk
        $hash = Get-FileHash $proc.Path -Algorithm SHA1 -ErrorAction Stop
        $p = $proc
        $p | Add-Member -MemberType NoteProperty -Name FileHash -Value $hash.Hash
        $p | Add-Member -MemberType NoteProperty -Name HashAlgo -Value $hash.Algorithm
        $wsproclist += $p
        }
    catch
        {
        # error handling.  If the file can't be hashed, either it's not on disk or we don't have rights to it
        # note that you will need to edit the host and share for your environment
        $env:ComputerName, $proc.Name, $proc.Path | Out-File \\loghost\logshare\hash_error.log -Append
        }
    }
$wsproclist
}

$targets = Get-ADComputer -Filter * -Property DNSHostName
$DomainTaskList = @()
$i = 1
$count = $targets.count

foreach ($targethost in $targets) {
   Write-Host "$i of $count - $($targethost.DNSHostName)"
   if (Test-Connection -ComputerName $targethost.DNSHostName -Count 2 -Quiet) {
       $DomainTaskList += Invoke-Command -ComputerName $targethost.DNSHostName -ScriptBlock ${function:RemoteTaskList}
       }
   ++$i
   }

$DomainTaskList | Select-Object PSComputerName,Id,ProcessName,Path,FileHash,FileVersion,Product,ProductVersion,HashAlgo | Export-Csv domain-wide-tasks.csv -NoTypeInformation

With that CSV file exported, you can now look at the domain-wide list in Excel or any tool of your choice that will read a CSV file.
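PowerShell itself can also do the "sifting" for outliers.  A minimal sketch, assuming the CSV produced above (the cutoff of 2 is just an example value, and note that the counts are per process instance, not per host):

```powershell
# import the domain-wide process list, group by file hash,
# and flag hashes that appear only once or twice across the enterprise
Import-Csv domain-wide-tasks.csv |
    Group-Object FileHash |
    Where-Object { $_.Count -le 2 } |       # adjust the cutoff for your environment
    ForEach-Object { $_.Group } |
    Select-Object PSComputerName, ProcessName, Path, FileHash |
    Sort-Object FileHash
```

Anything that surfaces here is worth a closer look - rare hashes are exactly the "we didn't install that software" candidates this hunt is after.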

===============
Rob VandenBrink
Coherent Security

(c) SANS Internet Storm Center. https://isc.sans.edu Creative Commons Attribution-Noncommercial 3.0 United States License.
 