Web Interface

The web interface provides an easy way to schedule jobs; the framework itself does not offer periodic scheduling. The web interface is designed to support bulk URL processing. Results can be viewed in the web interface, and an XML feed is also provided.

Jobs submitted outside of the web interface (for example with the ‘hc’ utility) are not visible in the web interface.

Submitting a job

A job can be submitted via the ‘Create New Job’ option in the main menu. In this manual an example job will be created that processes two URLs: the benign page http://www.honeyspider.net and a malicious Metasploit exploit page. To follow along, use Metasploit to start an old Internet Explorer exploit with the exec module as payload. Set the CMD option to ‘calc.exe’ so that a Windows Calculator is started after the exploit succeeds.

createjob-1

First create a new file on your local system with both URLs (one per line). On the ‘Create New Job’ page you first select the workflow you want to use for the URLs. A short description of the selected workflow will appear below this option. Next, a ‘Feeder file’ has to be selected; this is the file with the URLs you want to analyze. The optional workflow parameters can be used to adapt the default parameters of the services in the chosen workflow. Click the ‘Show/Hide workflow XML’ link to view the XML of the currently selected workflow.
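The feeder file is simply a plain text file with one URL per line. As a minimal sketch, it can be created with a few lines of Python; the exploit page address used here is only a placeholder for the URL served by your own Metasploit instance.

    # Write the two example URLs to a feeder file, one URL per line.
    urls = [
        "http://www.honeyspider.net",        # benign example page
        "http://192.168.1.10:8080/exploit",  # placeholder for your Metasploit exploit page
    ]

    with open("feeder.txt", "w") as feeder:
        feeder.write("\n".join(urls) + "\n")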

createjob-2

Next we will configure the scheduling options. You can run a job once or multiple times. In the screenshots the job is scheduled to run every 30 minutes. The scheduling can be adapted later via the ‘Jobs Schedule’ option in the main menu.

createjob-3

In the job details options you can specify the job name. It is also possible to make the job visible only to the current user (admin users can still view all jobs, so be careful when granting admin rights in the ‘Admin’ pages).

When you click the ‘Save’ button, the job schedule is created. Links to the RSS feeds, one for all URLs and one for the non-benign URLs, will also appear.
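If you want to consume a feed outside the browser, a small Python sketch like the one below can fetch it and list the reported URLs. The feed address shown is a placeholder; copy the actual link that appears after saving the job, and note that the element names may differ slightly depending on the feed format.

    # Minimal sketch: fetch a job's RSS feed and print its items.
    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "http://localhost:8080/feeds/webinterfacedemo.xml"  # placeholder address

    with urllib.request.urlopen(FEED_URL) as response:
        tree = ET.parse(response)

    for item in tree.iter("item"):
        print(item.findtext("title", default=""), item.findtext("link", default=""))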

Viewing and editing a Job Schedule

A job schedule can be viewed via the Jobs Schedule option in the main menu.

schedule-1.png

When clicking on a job schedule in the list, you can view and edit the schedule.

schedule-2.png

With the grey buttons on top you can:

  • Edit schedule: Change the settings specified when creating the job schedule.

  • Disable schedule: Disable the schedule. You can enable the schedule again later.

  • Delete schedule: Delete the current job schedule.

You can view the jobs that have already run based on this schedule in the list displayed at the bottom of the page.

Viewing analysis results

Results can be viewed via the Jobs Overview option in the main menu. All jobs generated based on the job schedules appear in the list.

results-1

When clicking on a job instance based on our ‘webinterfacedemo’ job schedule, the job details are displayed.

results-2

The URLs are searchable via the ‘filter’ field. As can be seen, the http://www.honeyspider.net website is classified as a ‘BENIGN’ URL. Our Metasploit exploit page, however, is classified as ‘MALICIOUS’. Let’s click on this malicious URL to view more details.

results-3

The URL analysis details page shows more information. In the ‘URL summarized by analyzer’ field, all the analyzers that have run (based on the selected workflow) display their results. In this case both the capture and js-sta (JavaScript) analyzers classify this page as malicious. Let’s click on ‘capture’ for the results of the capture analyzer.

results-4

After clicking the grey ‘+ All’ button at the bottom right of this page, the details reported by the capture reporter are shown. You can see the changes made to the system in the Capture HPC sandbox after it visited the website. For now, click on the Log file.

results-5

You can clearly see that a new ‘calc.exe’ process was started, the payload of our Metasploit exploit.
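If you would rather inspect the log outside the browser, a short sketch like the one below scans a locally saved copy for lines mentioning the calculator process. The file name is a placeholder, and the exact log line format may differ between Capture HPC versions.

    # Scan a downloaded Capture HPC log for lines mentioning calc.exe.
    with open("capture.log", errors="replace") as log:
        for line in log:
            if "calc.exe" in line.lower():
                print(line.rstrip())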

Other analyzer results can be viewed in the same way.