Command status: Active
Supported by OpenApps API: Yes
Supported by Internal/Reseller API: Yes
Possibly queued processing: Yes (Always)


This command is designed to allow retrieval of large numbers of referring domains, beyond the 100,000 maximum of GetRefDomains, and can also return the top backlinks for each referring domain.

Resources consumed




50,000 + 20 per referring domain (referring domain counts can be retrieved from the GetIndexItemInfo RefDomains column)


This resource will be decreased by the actual number of rows of data retrieved (returned) by this command.
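As a quick sanity check, the cost formula above can be expressed as a small helper. This is a sketch; the figures come directly from the formula, and the referring-domain count would come from the GetIndexItemInfo RefDomains column:

```python
def analysis_cost(ref_domain_count: int) -> int:
    """Resource cost: a flat 50,000 plus 20 per referring domain.

    ref_domain_count can be read from the RefDomains column
    returned by GetIndexItemInfo.
    """
    return 50_000 + 20 * ref_domain_count

# A domain with 2,500 referring domains costs 100,000 resources up front:
print(analysis_cost(2_500))  # -> 100000
```

Checking this estimate before issuing the command helps avoid burning resources on items that are too large.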


Parameter Description


Required: must be set to: DownloadRefDomainBackLinks


Optional - defaults to historic
Either: "fresh" - to query against Fresh Index, or "historic" - to query against Historic Index.


Required: the index item being queried, using the same convention as GetIndexItemInfo. Examples: a URL, a subdomain, or a root domain.


Optional: can be set to 1 to 10
Default: 10
Defines the maximum number of URLs per referring domain to be returned


Optional: can be set to 0 or 1
Default: 0
Includes or excludes deleted links: 0 = show all links including deleted; 1 = exclude deleted links from results.


Analysis will be automatically aborted for any index item whose analysis cost is greater than the value specified. This is useful to prevent human errors, such as trying to download backlinks for items that are too large.

Default: 1000000


Important: the notification URL must be accessible from outside your intranet - do not specify internal servers that can't be accessed from our servers; if you do, you will never get the notification you expect!

Optional: if specified, this URL (HTTP/HTTPS protocols only) will be requested to notify you that the download has been fully prepared. The URL you provide can contain query string parameters to help you identify the download request that was made. We will also substitute (if they are present) the following macro variables (case sensitive):

%%DOWNLOAD_FILE%% - will be changed to the download filename
%%DOWNLOAD_FILE_LOCATION%% - will be changed to the download file location, using the PublicDownloadLocation variable below

Your notify URL should respond with a single piece of data: OK (2 characters, no HTML) to indicate that you have successfully received the notification. Any other response (including server failures on your side) will be treated as an error. In the case of an error, the notification URL will be called again a number of times using exponential backoff before failing.

Example of notification url:

(this URL doesn't actually exist!)
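For illustration, a minimal NotifyURL endpoint might look like the sketch below. The port, path, and query parameter name (`file`) are assumptions for the sake of the example; the only hard requirement stated above is the bare two-character OK response body:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

class NotifyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Query parameters are whatever you put in your NotifyURL, plus any
        # %%DOWNLOAD_FILE%% / %%DOWNLOAD_FILE_LOCATION%% macro values you
        # chose to include (the "file" parameter name here is hypothetical).
        params = parse_qs(urlparse(self.path).query)
        download_file = params.get("file", [""])[0]
        # ... record that download_file is ready, enqueue a fetch, etc. ...
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"OK")  # exactly "OK" - anything else is treated as an error

# To run standalone (port is arbitrary):
# HTTPServer(("0.0.0.0", 8080), NotifyHandler).serve_forever()
```

Remember that this endpoint must be reachable from outside your intranet, as noted above.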


Important: this FTP URL must be accessible from outside your intranet - do not specify internal servers that can't be accessed from our side; if you do, you will never get the upload!

Optional: if not specified, the prepared file with backlinks will be uploaded to our default location. This behaviour can be overridden by specifying your own FTP server using an appropriate URI format, with either the relative path or the absolute path of the upload directory.

Make sure that the user specified is allowed to write files in the directory that you designated for these uploads. If a trailing filename is specified, it will be prepended to the output filename when it is uploaded to your server.

If not specified, the upload will be made to our default location, from where you will be able to download the file.


Optional: by default the prepared backlinks file will be available from our standard download URL (note: directory listing is denied there by design). If you use an alternative FTP location and know which public URL corresponds to it, you are advised to supply it here, unless you are planning to analyse the data locally.

Sample queries and response

Requesting a domain

This is a protocol-level example query that uses a special URL that was overridden to have zero analysis cost (you will need to use your own API_KEY to analyse other URLs):
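A request of this shape can be assembled as a plain query string. Note the hedging here: the endpoint and the key names `app_api_key`, `cmd`, `item`, `datasource`, and `MaxSourceURLsPerRefDomain` are illustrative placeholders, not confirmed by this page (only the command name, the fresh/historic values, and SkipIfAnalysisCostGreaterThan appear in the text above); substitute the real endpoint and parameter names from your API account documentation:

```python
from urllib.parse import urlencode

API_ENDPOINT = "https://api.example.com/api_command"  # hypothetical endpoint

def build_query(api_key: str, item: str) -> str:
    """Assemble a DownloadRefDomainBackLinks request URL (sketch only)."""
    params = {
        "app_api_key": api_key,                   # hypothetical key name
        "cmd": "DownloadRefDomainBackLinks",      # required command name
        "item": item,                             # URL, subdomain, or root domain
        "datasource": "historic",                 # or "fresh"
        "MaxSourceURLsPerRefDomain": 10,          # hypothetical key name
        "SkipIfAnalysisCostGreaterThan": 1_000_000,
    }
    return API_ENDPOINT + "?" + urlencode(params)

print(build_query("YOUR_API_KEY", "example.com"))
```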

XML response

<Result Code="QueuedForProcessing" ErrorMessage="" FullError="">
   <GlobalVars ETA="n/a" IndexBuildDate="2017-09-04 13:42:54" IndexType="0" JobID="00E6E6992898B46C91D3F722430413DD" ReportName="DownloadRefDomainBackLinks" ReportPosition="1" ServerBuild="2017-10-13 13:57:22" ServerName="SHADOJAGUAR" ServerVersion="1.0.6495.23321" TotalReports="1" UniqueIndexID="20170904134254-HISTORICAL" UserID="895472"/>
</Result>

JSON response

{
  "Code": "QueuedForProcessing",
  "ErrorMessage": "",
  "FullError": "",
  "ETA": "n/a",
  "IndexBuildDate": "2017-09-04 13:42:54",
  "IndexType": 0,
  "JobID": "3E5FBCFC70DE33A553C71AD88222A253",
  "ReportName": "DownloadRefDomainBackLinks",
  "ReportPosition": 1,
  "ServerBuild": "2017-10-13 13:57:22",
  "ServerName": "HUMMERR",
  "ServerVersion": "1.0.6495.23321",
  "TotalReports": 1,
  "UniqueIndexID": "20170904134254-HISTORICAL",
  "UserID": 895472
}

This response indicates that the request was queued for asynchronous processing; note that a JobID value is returned in this case. If the same request was already made recently and data files were prepared for it, see the GetDownloadsList command for how to check for those data files.
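A client would typically dispatch on the Code field and keep the JobID for later lookups. A minimal sketch (the sample payload is abbreviated to just the fields the dispatch needs):

```python
import json

# Abbreviated sample of the JSON response shown above.
response_text = (
    '{"Code": "QueuedForProcessing", "ErrorMessage": "",'
    ' "JobID": "3E5FBCFC70DE33A553C71AD88222A253"}'
)
resp = json.loads(response_text)

if resp["Code"] == "QueuedForProcessing":
    # Keep the JobID: GetDownloadsList and DeleteDownloads refer to it.
    job_id = resp["JobID"]
elif resp["ErrorMessage"]:
    raise RuntimeError(resp["ErrorMessage"])
```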

Requesting specific data files via GetDownloadsList

XML response

<?xml version="1.0" encoding="UTF-8"?>
<Result Code="OK" ErrorMessage="" FullError="">
   <GlobalVars IndexBuildDate="2013-02-24 21:47:24" IndexType="1" ServerBuild="2017-11-06 16:39:24" ServerName="XSTHEMACHINE" ServerVersion="1.0.6519.29982" UniqueIndexID="20130224214724-FRESH" />
   <DataTables Count="1">
      <DataTable Name="Downloads" RowsCount="1" Headers="Description|JobID|Status|Created|Expires|LastUpdated|FileSize|LastError|PublicDownloadLocation|AnalysisProgressInfo">
         <Row>API Job: DownloadRefDomainBackLinks (3E5FBCFC70DE33A553C71AD8822|3E5FBCFC70DE33A553C71AD88222A253|4|16/10/2017 09:43:45|23/10/2017 09:43:45|16/10/2017 09:43:53|6483009| ||</Row>
      </DataTable>
   </DataTables>
</Result>

JSON response

{
  "Code": "OK",
  "ErrorMessage": "",
  "FullError": "",
  "IndexBuildDate": "2013-02-24 21:47:24",
  "IndexType": 1,
  "ServerBuild": "2017-11-06 16:39:24",
  "ServerName": "XSTHEMACHINE",
  "ServerVersion": "1.0.6519.29982",
  "UniqueIndexID": "20130224214724-FRESH",
  "DataTables": {
    "Downloads": {
      "Headers": {},
      "Data": [
        {
          "Description": "API Job: DownloadRefDomainBackLinks (3E5FBCFC70DE33A553C71AD8822",
          "JobID": "3E5FBCFC70DE33A553C71AD88222A253",
          "Status": "4",
          "Created": "16/10/2017 09:43:45",
          "Expires": "23/10/2017 09:43:45",
          "LastUpdated": "16/10/2017 09:43:53",
          "FileSize": "6483009",
          "LastError": "",
          "PublicDownloadLocation": "",
          "AnalysisProgressInfo": ""
        }
      ]
    }
  }
}

The response indicates where the data files are located. Please note that it is possible to supply your own FTP location for such uploads. It is also highly recommended to use the NotifyURL functionality, which will call your web application to confirm that processing of this long-running request is complete.
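In the XML form above, each DataTable row is a pipe-delimited string whose fields line up with the Headers attribute. A sketch of turning one row into a dictionary (note the caveat: a naive split assumes no field value itself contains a pipe):

```python
# Headers attribute copied from the Downloads DataTable above.
HEADERS = ("Description|JobID|Status|Created|Expires|LastUpdated|"
           "FileSize|LastError|PublicDownloadLocation|AnalysisProgressInfo")

def parse_row(row_text: str) -> dict:
    """Zip a pipe-delimited Downloads row against the Headers attribute."""
    return dict(zip(HEADERS.split("|"), row_text.split("|")))

record = parse_row(
    "API Job: DownloadRefDomainBackLinks (3E5FBCFC70DE33A553C71AD8822|"
    "3E5FBCFC70DE33A553C71AD88222A253|4|16/10/2017 09:43:45|"
    "23/10/2017 09:43:45|16/10/2017 09:43:53|6483009|||"
)
print(record["JobID"], record["Status"], record["FileSize"])
```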

Returned values

Global variables
Return value Description
Code Code indicating whether the command succeeded or failed.
ErrorMessage A message explaining the error. This will be blank if there is no error.
FullError Verbose explanation of error.
IndexBuildDate Date/time when the index which the command queried was built.
IndexType Code indicating whether the index queried was Historical (0) or Fresh (1).
JobID Unique identifier for this job.
ReportName Name of the command called.
ReportPosition Position of the files to download in the download queue.
ServerBuild Date/time the API server software was built.
ServerName Name of the server queried.
ServerVersion Version of the server queried.
TotalReports Number of reports in the download queue.
UniqueIndexID String consisting of date/time query was executed and which index it was conducted on.
UserID ID of the user who executed the query.

CSV contents

A CSV file in UTF-8 encoding can now be downloaded from the PublicDownloadLocation shown in the XML above.

Value Description
Target URL URL which backlink points to
Target ACRank ACRank of Target URL
Source URL Source URL (backlink)
Source ACRank ACRank of Source URL
Anchor Text Anchor text used in linking; for images it will be the text used in the ALT="" part of the tag, and for Mentions it will be the text used
Source Crawl Date The date when the backlink was last (most recently) crawled
Source First Found Date The date when the backlink was first found on the source page (historical data is more meaningful in this context because the Fresh Index only covers the last 30 days of crawl)
FlagNoFollow If set to + then the source URL was marked as nofollow
FlagImageLink Indicates if the source URL was an image
FlagRedirect Indicates if the source URL was a redirect
FlagFrame Indicates if the source URL was used in FRAME or IFRAME
FlagOldCrawl Indicates if the source URL was found to be deleted (removed)
FlagAltText If set to + then the anchor text was taken from the TITLE part of an A tag
FlagMention If set to + then the source URL was actually a text mention
SourceCitationFlow Citation Flow of the Source URL
SourceTrustFlow Trust Flow of the Source URL
TargetCitationFlow Citation Flow of the Target URL
TargetTrustFlow Trust Flow of the Target URL
SourceTopicalTrustFlow_Topic_0 Highest ranking trust flow topic for Source URL (if available)
SourceTopicalTrustFlow_Value_0 Highest ranking trust flow topic score for Source URL (if available)
TargetTopicalTrustFlow_Topic_0 Highest ranking trust flow topic for Target URL (if available)
TargetTopicalTrustFlow_Value_0 Highest ranking trust flow topic score for Target URL (if available)
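Once downloaded, the CSV can be processed with any UTF-8-aware reader. A sketch using the column names from the table above; the local path is whatever you saved the file as, and the choice to skip nofollow and deleted links is just one example of a filter:

```python
import csv

def followed_links(path: str):
    """Yield (Source URL, Target URL) pairs for live, followed links.

    Column names match the CSV contents table above; rows flagged with
    "+" in FlagNoFollow or FlagOldCrawl (deleted) are skipped.
    """
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("FlagNoFollow") == "+" or row.get("FlagOldCrawl") == "+":
                continue
            yield row["Source URL"], row["Target URL"]
```

For very large downloads the generator form keeps memory use flat, since rows are streamed rather than loaded at once.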

Related commands

To see details on finding the files created by running DownloadRefDomainBackLinks, please see the documentation regarding GetDownloadsList.

To see details on deleting this job, please see the documentation regarding DeleteDownloads.

To see details on how to obtain the cost of analysis, please see the documentation regarding GetIndexItemInfo.

Common problems

Problem Solution
Making requests to very large domains uses up resources very quickly. Use the SkipIfAnalysisCostGreaterThan parameter to avoid analysing domains that are too large, and/or call the GetIndexItemInfo command to get the analysis cost first.
Repeated calls to the same domain with slightly varied parameters can quickly use up available resources. Call this command once, then manipulate the data on your end.
Calls to this command yield little to no results. Ensure that your parameters aren't too narrow. In particular, check date range filtering: please consider that our main index is not updated every day.