Malware Triage Using Open Data to Help Develop Robust Indicators
Transcript

Malware Triage: Using Open Data to Help Develop Robust Indicators

Hello, my name is: Sergei Frankoff ([email protected])

Sean Wilson ([email protected])

OPENANALYSIS.NET

What is an IOC?

Indicators of Compromise (IOCs) are forensic artifacts of an intrusion that can be identified on a host or network.

openioc.org (http://openioc.org/resources/An_Introduction_to_OpenIOC.pdf)

IOC Formats

We Aren’t Talking About Formats!

Traditional View of IOCs

In Practice…

“Is APTx attacking us? I saw this frightening article in CISO monthly magazine…”

“WTF?? Ransomware just infected half of accounting??”

“Wait, didn’t I just remove this same trojan from Dave’s workstation last week?”

“Hey look, I just hooked that IOC feed to our IDS…”

“OMG! Our IDS just blocked traffic to all of our developers!”

A Possible Solution: Triage

What is it exploiting?

Is it malicious?

Suspicious URL

Suspicious E-mail

Intel feed

Do we have exposure?

Security Event

Incident!

A Better Solution: Triage + IOCs = Automation!

What is it exploiting?

Is it malicious?

Suspicious URL

Suspicious E-mail

Intel feed

Do we have exposure?

Security Event

Incident!

Filter knowns (using IOCs). Root cause analysis (produces IOCs).

IOCs Remove The Knowns And Reveal The Unknown

Unknown

Known Good

Known Bad
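The filtering idea above can be sketched as simple set operations over sample hashes: subtract the known good and the known bad, and only the unknowns are left for triage. The hash values and repositories below are made-up placeholders, not real data.

```python
# Sketch: split observed sample hashes into known-good, known-bad, and
# unknown buckets so only the unknowns need analyst attention.

def filter_knowns(sample_hashes, known_good, known_bad):
    """Return the three triage buckets as sets of hashes."""
    hashes = set(sample_hashes)
    return {
        "known_good": hashes & known_good,
        "known_bad": hashes & known_bad,
        "unknown": hashes - known_good - known_bad,
    }

known_good = {"aaa111"}   # e.g. hashes collected from a golden image
known_bad = {"bbb222"}    # e.g. hashes from an intel feed
observed = ["aaa111", "bbb222", "ccc333"]

print(filter_knowns(observed, known_good, known_bad)["unknown"])  # {'ccc333'}
```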

The Problem With AV

Artemis!1A5E05B1B9E1

Artemis!262BC0AE2FB0

Artemis!04296F13925B

Artemis!110C43F8A337

Artemis!9BFC61456261

Artemis!9BE792AC4667

Malware Specific IOCs

IOCs: Indicators of Compromise.

Malware Specific IOCs: forensic artifacts resulting from the presence or execution of malware.

AV Signatures

Robust IOCs

[Chart: Effectiveness of IOC (y-axis) over the Lifetime of a Malware Family (x-axis). As the Diversity of Malware in the family grows, a Brittle IOC loses effectiveness quickly while a Robust IOC keeps matching.]

Robust Is Not… One is the loneliest number :(

More Robust Is… Multiple Samples + Comparative Analysis!

Reverse engineering with a sandbox!

Most Robust Is… Multiple Samples + Code Review + Comparative Analysis!!

Reverse engineering with IDA and a debugger!!

The Key is Comparative Analysis

Primary Sample

Sample #1

Sample #2

Pivot (Attribute)

This is one of the most important slides in the presentation.

Building Robust Indicators

Analysis (Triage) → Identify Pivots → Discovery (Mining Open Data) → Comparative Analysis → Develop IOC → Test (Validate)

Analysis Triage

• Is it malicious?
• Can we identify the malware family?
• Collect static attributes
• Collect dynamic attributes

Is it Malicious / What is it?

Binarly

VS.

Static Attributes

Hmm… something isn’t right, there are no file properties for this executable?

I’m totally legit!

Static Attributes: Metadata

Compiler Artifacts

EXIF Data

Easily modified: can make poor indicators.

Sample discovery: can work as primary indicators.

Static Attributes: Imports

Compilation

Library and API Imports

Use Imphash!
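A simplified sketch of the imphash idea (Mandiant’s scheme: lowercase `dll.function` pairs in import-table order, joined by commas, MD5-hashed). Real implementations such as `pefile.PE.get_imphash()` also resolve ordinal imports from lookup tables; this sketch assumes named imports only.

```python
import hashlib

def simple_imphash(imports):
    """imports: list of (dll_name, function_name) in import-table order.
    Returns an MD5 over the normalized import list (imphash-style)."""
    parts = []
    for dll, func in imports:
        dll = dll.lower()
        # strip common library extensions, as the real algorithm does
        for ext in (".dll", ".ocx", ".sys"):
            if dll.endswith(ext):
                dll = dll[: -len(ext)]
        parts.append(f"{dll}.{func.lower()}")
    return hashlib.md5(",".join(parts).encode()).hexdigest()

imps = [("KERNEL32.dll", "CreateFileA"), ("ADVAPI32.dll", "RegOpenKeyExA")]
print(simple_imphash(imps))  # same imports in same order => same hash
```

Because the hash covers import order as well as content, samples built from the same source with the same compiler settings tend to share an imphash even when their file hashes differ, which is what makes it a useful pivot.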

Static Attributes: Strings

Analysis with context vs. analysis without context
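Extracting printable strings is the usual first pass here. A minimal stdlib sketch of what a `strings`-style pass does (ASCII only; real tools also scan for UTF-16 strings, which Windows binaries are full of):

```python
import re

def extract_strings(data, min_len=4):
    """Pull printable-ASCII runs out of raw bytes, like the `strings` tool."""
    pattern = rb"[ -~]{%d,}" % min_len
    return [m.decode("ascii") for m in re.findall(pattern, data)]

# Hypothetical blob standing in for a file's raw bytes.
blob = b"\x00\x01hello\x02QKitMan2016\xff\x10ok"
print(extract_strings(blob))  # ['hello', 'QKitMan2016']
```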

Try our free PE analysis tool PFTriage!

This file is packed! We aren’t going to get any useful static attributes.

Static Attributes Identified

• Initial Sample was packed with UPX

• Contains no file or version metadata

Packer / Crypter vs. Static Attributes

Packer Stub

Obfuscated Payload

Payload

Low Quality Static Attributes

Packer / Crypter Weakness: Runtime!

[Diagram: a packed PE at runtime vs. the original. The stub unpacks the payload sections (A, X, B, C) in memory, and both versions end up making the same sequence of Windows API calls (1-5).]

Sandbox Magic

[Diagram: the same packed PE running in a sandbox. The sandbox hooks the Windows API and records the payload’s behaviour by category: Network, Filesystem, Registry, Process, Synchronization, Services.]

Sandbox Process Monitor

When Your Sandbox Doesn’t Work…Ghetto Runtime Analysis


Dynamic Attributes

In-Memory Strings

Process Handles / Mutex

Access / Created Files

Registry Keys

Network Traffic

Level Up! Your Analysis With Some Light Debugging

Quickly trace the sample in a debugger to deobfuscate strings and gain CONTEXT.

Try Windbg!

Dynamic Attributes Identified

• Creates Mutex: QKitMan2016_1

• Creates Registry Key: HKEY_CURRENT_USER\SOFTWARE\QKitMan2016

• Requests IP from IPReq using HTTP GET Request

• Post IP as Payload to LiveJournal account qkitman1010
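The attributes above are exactly the kind of material that later feeds an indicator. As a sketch of that step, here is a tiny generator that renders collected string attributes into a minimal YARA-style rule body. The rule name and condition are hypothetical, and a real rule would be tuned and tested against known good and known bad first (the Testing section below).

```python
# Sketch: render triaged string attributes into a minimal YARA rule text.

def yara_rule(name, strings, condition="any of them"):
    """Render a minimal YARA rule from identifier -> string IOC pairs."""
    lines = [f"rule {name}", "{", "    strings:"]
    for ident, value in strings.items():
        lines.append(f'        ${ident} = "{value}"')
    lines += ["    condition:", f"        {condition}", "}"]
    return "\n".join(lines)

rule = yara_rule("qkitman_triage", {
    "mutex": "QKitMan2016_1",             # mutex seen during dynamic analysis
    "regkey": "SOFTWARE\\\\QKitMan2016",  # registry key, backslash escaped for YARA
})
print(rule)
```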

Identify Pivots

• Collect your notes
• Choose best pivots
• Prepare to hunt

Rough Notes Are OK


Discovery (Hunting)

Searching for related samples

Mining open data (the easy way)

Acquiring samples

Mining Open Data

virusshare.com

Tricks For Searching Online Sandboxes

• Shared Virtualization Infrastructure

• Shared Templates

• Hunting using computed Values
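Hunting with computed values usually starts with the standard hashes, since every open sandbox and sharing service is searchable by them. A stdlib sketch that computes the common pivots for one sample (the empty bytes stand in for real file contents):

```python
import hashlib

def hash_pivots(data):
    """Compute the hash values commonly used to search open sandbox data."""
    return {
        "md5": hashlib.md5(data).hexdigest(),
        "sha1": hashlib.sha1(data).hexdigest(),
        "sha256": hashlib.sha256(data).hexdigest(),
    }

sample = b""  # stand-in for file bytes: open(path, "rb").read()
print(hash_pivots(sample)["md5"])  # d41d8cd98f00b204e9800998ecf8427e
```

Exact hashes only find exact copies; fuzzier computed values like imphash are what let you pivot to related-but-not-identical samples.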

Mining Open Data With OAPivot

Sample Acquisition

Sandbox Shared Samples

Sharing Services

DFIR Lists and Trust groups

Sandbox Shared Samples

Share your samples!

Sharing Services

VirusShare

Comparative Analysis

Identify common characteristics

Common Properties

Common Behaviour

This is a key section even though there aren’t a lot of slides.

Comparison Checklist

                  Initial Sample   Pivot Sample A   Pivot Sample B
Strings                 X                X
Exif Data
Imphash
Memory Strings          X                X                X
Mutex                   X                X
File Names              X
Registry Keys           X                X                X
Network Traffic         X                X

Some network strings remain constant between all samples while others differ!

The mutex changes between samples but only slightly… maybe we can work with that.

Look at that! The same registry key for all samples… I remember that key from the STRINGS too!
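A mutex that changes only slightly between samples is a classic candidate for a pattern match instead of an exact match. As a sketch, assume (hypothetically) that only the year and trailing counter of QKitMan2016_1 vary; a small regex generalizes across those:

```python
import re

# Assumed variation: QKitMan<year>_<counter>. If new samples break this
# assumption, the pattern needs widening, which is why IOCs get re-tested.
MUTEX_PATTERN = re.compile(r"QKitMan\d{4}_\d+")

def matches_mutex(name):
    return bool(MUTEX_PATTERN.fullmatch(name))

print(matches_mutex("QKitMan2016_1"))   # True
print(matches_mutex("SomeOtherMutex"))  # False
```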

Level Up! Your Comparative Analysis With Some Light Disassembly

Comparative analysis works at the byte code level as well!

The opcodes of the string building algorithm are identical. Don’t forget to use wildcards for variable bytes (0D)!!
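Byte-level matching with wildcards, as described above, can be sketched in a few lines: fixed opcode bytes must match exactly, while a wildcard position stands in for a variable byte (like the 0D mentioned on the slide). The example bytes are illustrative, not taken from a real sample.

```python
def find_pattern(data, pattern):
    """Scan bytes for an opcode pattern; None entries are wildcards.
    Returns the offsets of every match."""
    hits = []
    for i in range(len(data) - len(pattern) + 1):
        if all(p is None or data[i + j] == p for j, p in enumerate(pattern)):
            hits.append(i)
    return hits

# Match 8B 45 ?? with the variable byte wildcarded.
code = bytes([0x90, 0x8B, 0x45, 0x0D, 0xC3])
print(find_pattern(code, [0x8B, 0x45, None]))  # [1]
```

This is the same idea YARA hex strings express with `??` wildcards, just written out by hand.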

Develop IOC

Choose IOC Format(s)

Develop IOC

IOC Formats

We Really Really Aren’t Talking About Formats!

There are tons of links to great free IOC training on our site : )

Testing

Test IOCs Against Known Bad

Test IOCs Against Known Good

Automate Discovery

Known Bad

Indicator types dictate how they are tested.

Run indicators against a repository of known bad.

Validate! Update when required.

Known Good

Test indicators against a repository of known good samples.

Validate! Resolve issues.

Try testing IOCs against your corporate “Golden image(s)”.
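Testing a string IOC against a golden image boils down to scanning every known-good file for the indicator; any hit is a false positive to resolve before deployment. A stdlib sketch, using a throwaway directory to stand in for a mounted golden image:

```python
import os
import tempfile

def ioc_false_positives(root, needle):
    """Scan every file under root (e.g. a mounted golden image) for a
    string IOC. Any hit on known-good content is a false positive."""
    hits = []
    for dirpath, _, filenames in os.walk(root):
        for fn in filenames:
            path = os.path.join(dirpath, fn)
            with open(path, "rb") as f:
                if needle in f.read():
                    hits.append(path)
    return hits

# Demo with a temporary directory standing in for the golden image.
with tempfile.TemporaryDirectory() as root:
    with open(os.path.join(root, "notes.txt"), "wb") as f:
        f.write(b"routine corporate file contents")
    print(ioc_false_positives(root, b"QKitMan2016"))  # [] -> IOC looks clean
```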

Monitor

When indicators stop matching new samples something has changed!

Test Automation

Waiting for samples to hit your organization to test indicators

vs.

Automating discovery before your organization is affected

Key Takeaways

Triage + IOCs = Automation!

Robust IOCs can be built without the need for a debugger or disassembly.

Comparative analysis is key!

Open data can be leveraged to collect related samples. Try OAPivot…

Remember to continuously test your IOCs.

Try It Yourself

http://bit.ly/2frHKg3

763c7763a55b898b9618a23f85edfab6

Thank you : )

And don’t forget… openanalysis.net

Image Attribution

• Noun Project - Molecules by Zoë Austin

• Noun Project - Funnel by Vaibhav Radhakrishnan

• Noun Project - Shield by AFY Studio

• Noun Project - Checklist by Arthur Shlain

• Noun Project - Rocket Man by LuisPrado

• Noun Project - Head-Desk by Karthik Srinivas

• Noun Project - Checked Database by Arthur Shlain

• Noun Project - Kevin Augustine LO

• Noun Project - Compilation by Richard Slater

• Noun Project - Database Warning by ProSymbols

• Noun Project - Flow Chart by Richard Schumann

• Noun Project - Debug by Lemon Liu

• Noun Project - Network by Creative Stall

• Noun Project - File Settings by ProSymbols

