.. _section-hawat-plugin-timeline:

timeline
================================================================================

This pluggable module provides features related to `IDEA `__ event database
timeline visualisations. It enables users to execute custom queries via the
provided search form and browse the search results in the form of various
timeline visualisations.

.. note::

    Please be aware that the database search and the subsequent statistical
    calculations and visualisations are extremely expensive in terms of
    computation power and time. If you encounter errors or do not get any
    results, you may need to adjust the search criteria to lower the amount
    of processed data.


.. _section-hawat-plugin-timeline-endpoints:

Provided endpoints
--------------------------------------------------------------------------------

``/timeline/search``
    * *Reference:* :ref:`section-hawat-plugin-timeline-features-search`

``/api/timeline/search``
    * *Reference:* :ref:`section-hawat-plugin-timeline-webapi-search`


.. _section-hawat-plugin-timeline-features:

Features
--------------------------------------------------------------------------------

.. _section-hawat-plugin-timeline-features-search:

Event timeline search
````````````````````````````````````````````````````````````````````````````````

**Relevant endpoints:**

``/timeline/search``
    * *Authentication:* login required
    * *Authorization:* any role
    * *Methods:* ``GET``

``/api/timeline/search``
    * Underlying web API endpoint
    * *Reference:* :ref:`section-hawat-plugin-timeline-webapi-search`

**Menu integration:** */ Dashboards / Timeline*

This view provides access to the `IDEA `__ event timeline visualisations. It
enables users to create complex database queries and browse the results in the
form of various charts and tables.

.. figure:: /_static/mentat-hawat-timeline-search-form.png
    :alt: Hawat: Timeline - Search event timeline

    Hawat: Timeline - Search event timeline

The event timeline search form is designed to be as exhaustive as possible:
there is a parameter for every event attribute that is indexed and searchable
within the database. As a result, the full search form is quite large. To
conserve space, a button bar at the top of the form toggles the visibility of
the various form sections, so that each is displayed only when needed. Note
that some of the toggled form sections are mutually exclusive; for example,
you may not search according to both detection and storage times.

This form is very similar to the :ref:`event database search form `; it
however lacks certain features like pagination, sorting and result limiting,
which are not applicable to chart visualisations.


.. _section-hawat-plugin-timeline-webapi:

Web API
--------------------------------------------------------------------------------

For general information about the web API please refer to section
:ref:`section-hawat-webapi`. Following is a list of all currently available
API endpoints. These endpoints provide results as JSON documents instead of
full HTML pages.

.. _section-hawat-plugin-timeline-webapi-search:

API endpoint: **search**
````````````````````````````````````````````````````````````````````````````````

**Relevant endpoint:** ``/api/timeline/search``

    * *Authentication:* login required
    * *Authorization:* any role
    * *Methods:* ``GET``, ``POST``

The URL of the web API interface is available as a normal endpoint to the user
of the web interface. This fact can be used to debug queries interactively and
then simply copy them to another application. One might for example start by
filling in the search form at the ``/timeline/search`` endpoint. Once you are
satisfied with the result, simply switch the base URL to the
``/api/timeline/search`` endpoint and you are all set.
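A minimal Python sketch of this workflow follows: it takes a query URL copied
from the browser and only rewrites the path from ``/timeline/search`` to
``/api/timeline/search``, keeping the query string intact. The hostname and the
query values are placeholders.

.. code-block:: python

    from urllib.parse import urlsplit, urlunsplit

    # URL copied from the browser after submitting the interactive search form
    # (hostname and query values are placeholders).
    browser_url = (
        "https://hawat.example.org/timeline/search"
        "?dt_from=2018-01-01+00%3A00%3A00&categories=Recon&submit=Search"
    )

    # Rewrite only the path component to target the web API endpoint; the
    # query string stays exactly the same.
    parts = urlsplit(browser_url)
    api_url = urlunsplit(parts._replace(path="/api/timeline/search"))

    print(api_url)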
**Available query parameters**:

The following parameters may be specified as standard HTTP query parameters:

*Time related query parameters*

``dt_from``
    * *Description:* Lower event detection time boundary
    * *Datatype:* Datetime in the format ``YYYY-MM-DD HH:MM:SS``, for example ``2018-01-01 00:00:00``

``dt_to``
    * *Description:* Upper event detection time boundary
    * *Datatype:* Datetime in the format ``YYYY-MM-DD HH:MM:SS``, for example ``2018-01-01 00:00:00``

``st_from``
    * *Description:* Lower event storage time boundary
    * *Datatype:* Datetime in the format ``YYYY-MM-DD HH:MM:SS``, for example ``2018-01-01 00:00:00``

``st_to``
    * *Description:* Upper event storage time boundary
    * *Datatype:* Datetime in the format ``YYYY-MM-DD HH:MM:SS``, for example ``2018-01-01 00:00:00``

.. warning::

    All time related query parameters are by default expected to be in the
    timezone that the user may set in his or her profile. This could lead to
    unexpected bugs, so it is also possible to use UTC timestamps in the ISO
    format ``YYYY-MM-DDTHH:MM:SSZ``. By using this format you can always be
    certain about the time parameters, which is particularly useful for
    example when generating URLs programmatically and without knowledge of the
    user's timezone settings.
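To sidestep the timezone ambiguity described above, the time boundaries can be
generated directly in the UTC ISO format. A minimal sketch (the chosen time
window is arbitrary):

.. code-block:: python

    from datetime import datetime, timedelta, timezone

    # Build an explicit UTC time window covering the last 24 hours and format
    # it in the unambiguous ISO format accepted by the timeline endpoints.
    now = datetime.now(timezone.utc)
    params = {
        "dt_from": (now - timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%SZ"),
        "dt_to": now.strftime("%Y-%m-%dT%H:%M:%SZ"),
        "submit": "Search",
    }

    print(params)
    # e.g. {'dt_from': '2018-01-01T00:00:00Z', 'dt_to': '2018-01-02T00:00:00Z', 'submit': 'Search'}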
*Origin related query parameters*

``source_addrs``
    * *Description:* List of required event sources
    * *Datatype:* ``list of IP(4|6) addresses|networks|ranges as strings``
    * *Logical operation:* All given values are *ORed*

``source_ports``
    * *Description:* List of required event source ports
    * *Datatype:* ``list of integers``
    * *Logical operation:* All given values are *ORed*

``source_types``
    * *Description:* List of required event source `types (tags) `__
    * *Datatype:* ``list of strings``
    * *Logical operation:* All given values are *ORed*

``target_addrs``
    * *Description:* List of required event targets
    * *Datatype:* ``list of IP(4|6) addresses|networks|ranges as strings``
    * *Logical operation:* All given values are *ORed*

``target_ports``
    * *Description:* List of required event target ports
    * *Datatype:* ``list of integers``
    * *Logical operation:* All given values are *ORed*

``target_types``
    * *Description:* List of required event target `types (tags) `__
    * *Datatype:* ``list of strings``
    * *Logical operation:* All given values are *ORed*

``host_addrs``
    * *Description:* List of required event sources or targets
    * *Datatype:* ``list of IP(4|6) addresses|networks|ranges as strings``
    * *Logical operation:* All given values are *ORed*

``host_ports``
    * *Description:* List of required event source or target ports
    * *Datatype:* ``list of integers``
    * *Logical operation:* All given values are *ORed*

``host_types``
    * *Description:* List of required event source or target `types (tags) `__
    * *Datatype:* ``list of strings``
    * *Logical operation:* All given values are *ORed*

*Event related query parameters*

``groups``
    * *Description:* List of required event resolved groups
    * *Datatype:* ``list of strings``
    * *Logical operation:* All given values are *ORed*

``not_groups``
    * *Description:* Invert the group selection
    * *Datatype:* ``boolean``

``protocols``
    * *Description:* List of required event protocols
    * *Datatype:* ``list of strings``
    * *Logical operation:* All given values are *ORed*

``not_protocols``
    * *Description:* Invert the protocol selection
    * *Datatype:* ``boolean``

``description``
    * *Description:* Event description
    * *Datatype:* ``string``

``categories``
    * *Description:* List of required event `categories `__
    * *Datatype:* ``list of strings``
    * *Logical operation:* All given values are *ORed*

``not_categories``
    * *Description:* Invert the category selection
    * *Datatype:* ``boolean``

``severities``
    * *Description:* List of required event severities
    * *Datatype:* ``list of strings``
    * *Logical operation:* All given values are *ORed*

``not_severities``
    * *Description:* Invert the severity selection
    * *Datatype:* ``boolean``

*Detector related query parameters*

``detectors``
    * *Description:* List of required event detectors
    * *Datatype:* ``list of strings``
    * *Logical operation:* All given values are *ORed*

``not_detectors``
    * *Description:* Invert the detector selection
    * *Datatype:* ``boolean``

``detector_types``
    * *Description:* List of required event detector `types (tags) `__
    * *Datatype:* ``list of strings``
    * *Logical operation:* All given values are *ORed*

``not_detector_types``
    * *Description:* Invert the detector type selection
    * *Datatype:* ``boolean``

*Special query parameters*

``inspection_errs``
    * *Description:* List of required event inspection errors
    * *Datatype:* ``list of strings``
    * *Logical operation:* All given values are *ORed*

``not_inspection_errs``
    * *Description:* Invert the inspection error selection
    * *Datatype:* ``boolean``

``classes``
    * *Description:* List of required event classes
    * *Datatype:* ``list of strings``
    * *Logical operation:* All given values are *ORed*

``not_classess``
    * *Description:* Invert the class selection
    * *Datatype:* ``boolean``

``aggregations``
    * *Description:* List of statistical aggregations to perform. The default
      is an empty list, which performs all of the following aggregations::

        (mentat.stats.idea.ST_SKEY_CATEGORIES,  {}, {"aggr_set": "category"}),
        (mentat.stats.idea.ST_SKEY_IPS,         {}, {"aggr_set": "source_ip"}),
        #('',                                   {}, {"aggr_set": "target_ip"}),
        (mentat.stats.idea.ST_SKEY_SRCPORTS,    {}, {"aggr_set": "source_port"}),
        (mentat.stats.idea.ST_SKEY_TGTPORTS,    {}, {"aggr_set": "target_port"}),
        (mentat.stats.idea.ST_SKEY_SRCTYPES,    {}, {"aggr_set": "source_type"}),
        (mentat.stats.idea.ST_SKEY_TGTTYPES,    {}, {"aggr_set": "target_type"}),
        (mentat.stats.idea.ST_SKEY_PROTOCOLS,   {}, {"aggr_set": "protocol"}),
        (mentat.stats.idea.ST_SKEY_DETECTORS,   {}, {"aggr_set": "node_name"}),
        (mentat.stats.idea.ST_SKEY_DETECTORTPS, {}, {"aggr_set": "node_type"}),
        (mentat.stats.idea.ST_SKEY_ABUSES,      {}, {"aggr_set": "resolvedabuses"}),
        (mentat.stats.idea.ST_SKEY_CLASSES,     {}, {"aggr_set": "eventclass"}),
        (mentat.stats.idea.ST_SKEY_SEVERITIES,  {}, {"aggr_set": "eventseverity"}),

    * *Datatype:* ``list of strings``
    * *Default:* ``[]``

``limit``
    * *Description:* Perform toplisting for address and port statistics
    * *Datatype:* ``integer [1..1000]``
    * *Default:* ``100``

*Common query parameters*

``submit``
    * *Description:* Search trigger button
    * *Datatype:* ``boolean`` or ``string`` (``True`` or ``"Search"``)
    * *Note:* This query parameter must be present to trigger the search

Parameters ``page`` and ``sortby`` are not supported by this endpoint. The
``limit`` parameter is supported, but only with the toplisting meaning
described above, not for result paging. A query combining several of the
parameters above is sketched after the search examples below.

**Search examples**

* Default search query::

    /api/timeline/search?submit=Search

* Search without the default lower detection time boundary::

    /api/timeline/search?dt_from=&submit=Search
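More complex queries can be composed programmatically. The sketch below only
builds and prints the query string; list-valued parameters are repeated, the
same way the HTML search form submits multiple selections. All parameter names
come from the list above; the concrete values, and the assumption that the
``aggregations`` parameter accepts the aggregation names listed in the response
format below, are illustrative.

.. code-block:: python

    from urllib.parse import urlencode

    # Compose a richer query: two categories, one severity, a toplisting limit
    # and an explicit choice of aggregations. Values are purely illustrative.
    query = [
        ("categories", "Recon"),
        ("categories", "Attempt.Login"),
        ("severities", "high"),
        ("limit", "10"),
        ("aggregations", "categories"),
        ("aggregations", "ips"),
        ("submit", "Search"),
    ]

    # List-valued parameters are simply repeated in the query string.
    print("/api/timeline/search?" + urlencode(query))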
**Response format**

The JSON document received as a response to the search can contain the
following keys:

``aggregations``
    * *Description:* This subkey is present in case the search was successful.
      It contains a list of all aggregations that were actually performed.
    * *Datatype:* ``list``
    * *Default*: ``["categories", "ips", "source_ports", "target_ports", "source_types", "target_types", "protocols", "detectors", "detector_types", "abuses", "classes", "severities"]``

``form_data``
    * *Description:* This subkey is present in case the search operation was
      triggered. It contains a dictionary with all query parameters described
      above and their appropriately processed values.
    * *Datatype:* ``dictionary``

``form_errors``
    * *Description:* This subkey is present in case there were any errors in
      the submitted search form and the search operation could not be
      triggered. In other words, the presence of this subkey is an indication
      of a search failure. It contains a list of all form errors as pairs of
      strings: the name of the form field and the error description. The error
      description is localized according to the user's preferences.
    * *Datatype:* ``list of tuples of strings``
    * *Example:* ``[["dt_from", "Not a valid datetime value"]]``

``statistics``
    * *Description:* This subkey is present in case the search operation was
      triggered. It contains the actual result of the search. The following
      subkeys can be found in this dictionary:

      * ``count`` - Number of statistical datasets used to calculate this result dataset
      * ``dt_from`` - Lower time boundary of the result dataset
      * ``dt_to`` - Upper time boundary of the result dataset
      * ``timeline_cfg`` - Pre-calculated optimized timeline configurations

      Additionally, there are the following subkeys: ``abuses``,
      ``categories``, ``classes``, ``detectors``, ``detector_types``, ``ips``,
      ``protocols``, ``severities``, ``source_ports``, ``source_types``,
      ``target_ports``, ``target_types``. Each of these subkeys represents an
      aggregation of events by a particular attribute. There is also a counter
      ``cnt_events`` (the total number of original events), which is an
      integer. Finally, there is a ``timeline`` subkey, which contains
      everything that was described so far, but rendered onto the timeline.
    * *Datatype:* ``dictionary``

``items_count``
    * *Description:* This subkey is present in case the search operation was
      triggered. It contains the number of original datasets that have been
      processed to produce the final ``statistics`` dataset.
    * *Datatype:* ``integer``

``query_params``
    * *Description:* This subkey is always present in the response. It
      contains the processed search query parameters that the user actually
      explicitly specified.
    * *Datatype:* ``dictionary``
    * *Example:* ``{"dt_from": "", "submit": "Search"}``

``time_marks``
    * *Description:* This subkey is present in case the search operation was
      triggered. It contains a list of time marks that can be used to
      calculate the duration of the various processing steps like querying
      the database, processing and rendering the result.
    * *Datatype:* ``list of lists``

``searched``
    * *Description:* This subkey is present in case the search operation was
      triggered. It is a simple indication of a successful search operation.
    * *Datatype:* ``boolean``, always set to ``True``

``search_widget_item_limit``
    * *Description:* This subkey is always present in the response. It is
      intended for internal purposes.
    * *Datatype:* ``integer``

``view_icon``
    * *Description:* This subkey is always present in the response. It is
      intended for internal purposes.
    * *Datatype:* ``string``

``view_title``
    * *Description:* This subkey is always present in the response. It is
      intended for internal purposes.
    * *Datatype:* ``string``
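Assuming the response body has already been parsed into a Python dictionary,
the keys described above can be interpreted roughly as follows. The ``raw``
string below is only an illustrative placeholder, not an actual server
response.

.. code-block:: python

    import json

    # Placeholder for the body returned by the /api/timeline/search endpoint.
    raw = (
        '{"searched": true, "items_count": 0, "aggregations": [],'
        ' "statistics": {"cnt_events": 0}, "query_params": {"submit": "Search"}}'
    )
    response = json.loads(raw)

    if "form_errors" in response:
        # The search was not triggered at all - report which form fields failed.
        for field, error in response["form_errors"]:
            print(f"invalid {field}: {error}")
    elif response.get("searched"):
        stats = response["statistics"]
        print("datasets processed:", response["items_count"])
        print("aggregations performed:", ", ".join(response["aggregations"]))
        print("total events:", stats.get("cnt_events", 0))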
**Example usage with curl:**

.. code-block:: shell

    $ curl -X POST -d "api_key=your%AP1_k3y" "https://.../api/timeline/search?submit=Search"

.. note::

    Please be aware that this endpoint provides the raw statistical dataset.
    The :ref:`section-hawat-plugin-timeline-features-search` endpoint performs
    some additional client-side processing with JavaScript to transform the
    datasets provided by this endpoint into a format appropriate for chart and
    table rendering.
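The curl example above translates to roughly the following Python call. The
target host and the API key value are placeholders, and the third-party
``requests`` library is assumed to be available.

.. code-block:: python

    import requests

    # Placeholders - substitute a real Hawat host and a valid API key.
    BASE_URL = "https://hawat.example.org"
    API_KEY = "your_API_key"

    # Mirror of the curl example: the API key is sent as form data in a POST
    # request, while the search itself is driven by the query string.
    response = requests.post(
        f"{BASE_URL}/api/timeline/search",
        data={"api_key": API_KEY},
        params={"submit": "Search"},
        timeout=60,
    )
    response.raise_for_status()
    print(response.json().get("items_count"))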