Site Details

I have a CSV file to upload

(Available during site creation) Select this option to add one or more sites by uploading a CSV file with site details, and then browse to your CSV file. Your file may include any number of rows; each row includes details for a single site. Follow the formatting guidelines outlined on the screen.
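
For illustration, here's a minimal sketch of a CSV file with one site per row, assuming the on-screen guidelines call for a title followed by a URL (the exact column layout is defined on the screen, and these sites are placeholders):

"Main Site","https://www.example.com"
"Staging Site","https://staging.example.com"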

Title

Provide a title for your site.

Site URL

Enter the URL for your site. The URL must include http:// or https://.
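
For example, a placeholder URL in the required format:

https://www.example.com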

Scan Settings

Assign Tags

(Optional) Assign tags to your site as a way to organize your sites into logical groups. You can select from existing tags, and you'll be able to schedule scans and create reports for tagged sites. Click the drop-down icon to select tags. Once selected, tags are highlighted in the tag tree. Click outside the tree to add the selected tags to the site. Want to define new tags? It's easy: just go to the Asset Management (AM) application.

Maximum Pages

The maximum number of pages the web crawler will scan for malware.

To see the maximum number of pages allowed for your account, go to Account Information: click the About link at the bottom of the screen, and then go to Malware Detection.

Scan Intensity

The scan intensity level to be used by the scanning engine. Each level represents multiple settings, including the maximum number of processes used to scan a single host and the delay between packets.

High. Scan performance is optimized for maximum bandwidth use, resulting in the fastest possible scan time. Compared to the other levels, more crawling and testing requests run in parallel, and the delay between requests sent to the web application is shorter. A High intensity scan may complete faster, but it may overload your network, web server, or database, and can potentially make your site unavailable during the scan.

Medium. Scan performance is optimized for medium bandwidth use.

Low. Scan performance is optimized for low bandwidth use. This level is the recommended setting.

Header Injection

Identify headers to be injected by our service when scanning the web site. Use this option when a workaround is needed for complex authentication schemes or to impersonate a web browser.

Enter header information in the field provided. You can enter a maximum of 131,072 characters.

Enter each header in the format: <header>: <text>.

Multiple headers may be entered. Each header must be on a separate line.
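
For illustration, a sketch that combines the session cookie from Example 1 with the user agent from Example 3 below, one header per line:

Cookie: mwf_login=2-e3b930b2cf6549d0351346d3cf56e9ae
User-Agent: Mozilla/5.0 (iPhone; U; CPU like Mac OS X; en) AppleWebKit/420+ (KHTML, like Gecko) Version/3.0 Mobile/1A543a Safari/419.3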

Example 1

To bypass a complex login form (for example, for multi-step authentication or CAPTCHA), where mwf_login is the session identifier for the application:

Cookie: mwf_login=2-e3b930b2cf6549d0351346d3cf56e9ae

Example 2

To bypass a complex login form (for example, for multi-step authentication or CAPTCHA), where ASPSESSIONIDAARTTCBQ is the session identifier for the application:

Cookie: ASPSESSIONIDAARTTCBQ=BGHDNEICDKJBGJFMOIAOPLAG

Example 3

To use a personalized user agent:

User-Agent: Mozilla/5.0 (iPhone; U; CPU like Mac OS X; en) AppleWebKit/420+ (KHTML, like Gecko) Version/3.0 Mobile/1A543a Safari/419.3

Some web applications display different information for different user agents. For instance, a web application accessed from a mobile device may serve lighter content with different functionality, links, forms, and underlying HTML code. For this reason, the scanning engine may find different vulnerabilities.

Example 4

To bypass basic authentication:

Authorization: Basic bXl1c2VyOm15cGFzc3dvcmQ=

When a header such as this is provided, its Basic authentication credentials override any authentication record that has Basic authentication defined.
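
The value after Basic is the Base64 encoding of the string <username>:<password>, as defined for HTTP Basic authentication; the value above decodes to myuser:mypassword. Here's a minimal Python sketch for generating the header value (the credentials are placeholders):

import base64

# HTTP Basic authentication encodes "username:password" in Base64
credentials = "myuser:mypassword"
token = base64.b64encode(credentials.encode("utf-8")).decode("ascii")

# Prints: Authorization: Basic bXl1c2VyOm15cGFzc3dvcmQ=
print("Authorization: Basic " + token)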

Crawl Exclusion Lists

Allow List

The allow list identifies the links (URLs) in the web application that you want to be scanned. For each string specified, our service performs a string match against each link it encounters. When a match is found, our service submits a request for the link. When only an allow list is configured (no exclude list), no links will be crawled unless they match an entry in the allow list. 

The allow list can consist of URLs and/or regular expressions.

URLs. Select the check box to enter URLs for the allow list. Each URL must include a fully qualified domain name. Enter each URL on a new line. You can enter a maximum of 2048 characters for each URL.

Regular Expressions. Select the check box to enter regular expressions for the allow list. Enter each regular expression on a new line. For example, specify /my/path/.* for all URLs under the /my/path/ directory. You can enter a maximum of 2048 characters for each regular expression.
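
For example, to limit crawling to a hypothetical catalog area of the site, you might select Regular Expressions and enter:

/catalog/.*

With only this allow list configured (and no exclude list), links that do not match /catalog/.* are not crawled.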

Exclude List

The exclude list identifies the links (URLs) in the web application that you do not want to be scanned. For each string specified, the crawler performs a string match against each link it encounters. When a match is found, the crawler does not submit a request for the link unless it also matches an allow list entry.

The exclude list can consist of URLs and/or regular expressions.

URLs. Select the check box to enter URLs for the exclude list. Each URL must include a fully qualified domain name. Enter each URL on a new line. You can enter a maximum of 2048 characters for each URL.

Regular Expressions. Select the check box to enter regular expressions for the exclude list. Enter each regular expression on a new line. For example, specify /my/path/.* for all URLs under the /my/path/ directory. You can enter a maximum of 2048 characters for each regular expression.
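
For example, to skip a hypothetical administrative area while still scanning one page inside it, you might enter the exclude regular expression:

/admin/.*

together with an allow list entry such as /admin/status.html. As noted above, a link that matches both lists is still crawled because the allow list match takes precedence.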

POST Data Exclude List

The POST data exclude list identifies URLs for which you want to block form submission, since submitting forms could have unwanted side effects like mass emailing. URLs are specified as regular expressions. When specified, the service blocks form submission for any URL matching the regular expression patterns and does not submit POST data (for example, form fields) to the blocked URLs during any scan phase.

Regular Expressions. Select to set up a list of regular expressions to identify URLs with form submissions you want to block. Enter each regular expression on a separate line in the field provided. You can enter a maximum of 2048 characters for each regular expression.
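
For example, to block form submission on a hypothetical contact page that triggers email, you might enter:

/contact/.*

With this pattern in place, the service does not submit forms (POST data) for matching URLs during any scan phase; crawling itself is controlled by the crawl exclusion lists above.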