Monitoring OLVM (Oracle KVM)

Hi,

Is there any suitable plugin or method to monitor the Oracle KVM environment?
I found the forum thread below, but the plugin mentioned there won't work with Checkmk versions older than 2.0. Is there any alternative to this plugin?

Using Ovirt plugin (DECOIT).

Thank you,
Benson Thomas

On the GitHub page of DECOIT you can find a 2.0 and a 1.6 version of the oVirt plugin.


@andreas-doehler ,

Thank you.
I have deployed the plugin on the OLVM engine host, but to complete the configuration, how do I create the ovirt_plugin.cfg?
I am getting the error below when running the plugin manually:

Traceback (most recent call last):
  File "/usr/lib/check_mk_agent/plugins/ovirt_plugin.py", line 408, in <module>
    main()
  File "/usr/lib/check_mk_agent/plugins/ovirt_plugin.py", line 380, in main
    config: dict = get_config(args.config_file)
  File "/usr/lib/check_mk_agent/plugins/ovirt_plugin.py", line 117, in get_config
    files_read = config.read(cfg_file)
  File "/usr/lib64/python3.6/configparser.py", line 697, in read
    self._read(fp, filename)
  File "/usr/lib64/python3.6/configparser.py", line 1080, in _read
    raise MissingSectionHeaderError(fpname, lineno, line)
configparser.MissingSectionHeaderError: File contains no section headers.
file: '/etc/check_mk/ovirt_plugin.cfg', line: 1
'admin@internal\n'

Config file details:
Username: USER@internal
Password: xxx
Engine FQDN: HOSTNAME
Engine URL: https://HOSTNAME/ovirt-engine

Thank you,
Benson Thomas

Any suggestions, please?
What am I missing in this configuration file?

It would be good to see what your configuration file looks like.
Inside the Python plugin you deploy on the target system, you can find a description of how the config file needs to look.
The config reader/parser expects [OVIRT] as the first line of the file.
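
For reference, this error is easy to reproduce: Python's configparser rejects any file whose first line is not a [section] header, which is exactly what the traceback above shows. A minimal sketch (the config strings here are illustrative):

```python
import configparser

# First line is a value, not a [section] header -> rejected,
# just like '/etc/check_mk/ovirt_plugin.cfg' in the traceback above.
broken = "admin@internal\nPassword: xxx\n"

# Same content preceded by the section header the plugin expects.
fixed = "[OVIRT]\nUsername: admin@internal\nPassword: xxx\n"

parser = configparser.ConfigParser()
try:
    parser.read_string(broken)
except configparser.MissingSectionHeaderError as exc:
    print("rejected:", type(exc).__name__)  # rejected: MissingSectionHeaderError

parser.read_string(fixed)
print(parser["OVIRT"]["Username"])  # admin@internal
```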

It worked :slight_smile: … Thank you.

I modified the configuration file as follows:
[OVIRT]
Username: admin@internal
Password: xxx
Engine FQDN: HOSTNAME
Engine URL: https://HOSTNAME/ovirt-engine

Now I am able to run the plugin manually on the client, but the plugin and its output are not being discovered inside Checkmk.

I have stored the plugin at:
/usr/lib/check_mk_agent/local/ovirt_plugin.py

Our Checkmk version:
Checkmk Managed Services Edition 2.2.0

What am I missing? :frowning:

Why local? It is a plugin and should be inside the plugins folder.

Initially I saved it in the location below, but I am still not able to discover the service inside Checkmk.

/usr/lib/check_mk_agent/plugins/ovirt_plugin.py

The agent plugin count is detected inside the Check_MK Agent check.

Just an update: currently I have installed the agent only on the oVirt engine host. Should I also install the agent on the KVM hosts?

I would first inspect the agent output.
What do you see in there?

/usr/lib/check_mk_agent/plugins/ovirt_plugin.py

I am getting the correct output when running the plugin manually:

[root@olvmmi ~]# /usr/lib/check_mk_agent/plugins/ovirt_plugin.py
<<<ovirt_overview:sep(0)>>>
@ovirt_version_info{"PluginVersion": "1.0.6"}
{"api": {"product_info": {"instance_id": "4924c078-3c46-11ee-8bbe-005056a058c2", "name": "Oracle Linux Virtualization Manager", "version": {"build": "10", "full_version": "4.4.10.7-1.0.25.el8", "major": "4", "minor": "4", "revision": "0"}}, "summary": {"hosts": {"active": "3", "total": "3"}, "storage_domains": {"active": "2", "total": "3"}, "users": {"active": "1", "total": "1"}, "vms": {"active": "1", "total": "1"}}}, "global_maintenance": false}
<<<ovirt_datacenters:sep(0)>>>
@ovirt_version_info{"PluginVersion": "1.0.6"}
{"datacenters": [{"id": "477ebf4e-3c46-11ee-820a-005056a058c2", "version": {"major": "4", "minor": "6"}, "status": "up", "description": "The default Data Center", "name": "Default", "supported_versions": {"version": [{"major": "4", "minor": "6"}]}}]}
<<<ovirt_storage_domains:sep(0)>>>
@ovirt_version_info{"PluginVersion": "1.0.6"}
{"storage_domains": [{"data_center": {"name": "Default", "id": "477ebf4e-3c46-11ee-820a-005056a058c2"}, "status": "active", "name": "SR01-OLVM-8200-SSD", "id": "6146266e-da8c-4539-a6f7-ac3f3417837a", "external_status": "ok", "description": "SR01- OLVM-8200-SSD", "committed": "85899345920", "available": "1007169830912", "used": "91268055040", "warning_low_space_indicator": "10"}, {"data_center": {"name": "Default", "id": "477ebf4e-3c46-11ee-820a-005056a058c2"}, "status": "active", "name": "ZENOLVM-ISO", "id": "b948c768-e09f-4b0d-acbb-b296db30cbed", "external_status": "ok", "description": "OLVM-ISO", "committed": "12884901888", "available": "148176371712", "used": "12884901888", "warning_low_space_indicator": "10"}]}
<<<ovirt_clusters:sep(0)>>>
@ovirt_version_info{"PluginVersion": "1.0.6"}
{"cluster": [{"id": "47817b3a-3c46-11ee-9ef9-005056a058c2", "version": {"major": "4", "minor": "6"}, "description": "The default server cluster", "name": "Default", "data_center": {"id": "477ebf4e-3c46-11ee-820a-005056a058c2"}}]}
<<<<olvmmi01.>>>>
<<<ovirt_hosts:sep(0)>>>
@ovirt_version_info{"PluginVersion": "1.0.6"}
{"version": {"build": "100", "full_version": "vdsm-4.40.100.2-1.0.13.el8", "major": "4", "minor": "40", "revision": "2"}, "status": "up", "summary": {"active": "1", "migrating": "0", "total": "1"}, "type": "rhel", "name": "olvmmi01.", "libvirt_version": {"build": "0", "full_version": "libvirt-7.10.0-2.module+el8.7.0+21035+a8208c98", "major": "7", "minor": "10", "revision": "0"}}
<<<<>>>>
<<<<olvmmi02.>>>>
<<<ovirt_hosts:sep(0)>>>
@ovirt_version_info{"PluginVersion": "1.0.6"}
{"version": {"build": "100", "full_version": "vdsm-4.40.100.2-1.0.13.el8", "major": "4", "minor": "40", "revision": "2"}, "status": "up", "summary": {"active": "0", "migrating": "0", "total": "0"}, "type": "rhel", "name": "olvmmi02.", "libvirt_version": {"build": "0", "full_version": "libvirt-7.10.0-2.module+el8.7.0+21035+a8208c98", "major": "7", "minor": "10", "revision": "0"}}
<<<<>>>>
<<<<olvmmi03.>>>>
<<<ovirt_hosts:sep(0)>>>
@ovirt_version_info{"PluginVersion": "1.0.6"}
{"version": {"build": "100", "full_version": "vdsm-4.40.100.2-1.0.13.el8", "major": "4", "minor": "40", "revision": "2"}, "status": "up", "summary": {"active": "0", "migrating": "0", "total": "0"}, "type": "rhel", "name": "olvmmi03.", "libvirt_version": {"build": "0", "full_version": "libvirt-7.10.0-2.module+el8.7.0+21035+a8208c98", "major": "7", "minor": "10", "revision": "0"}}
<<<<>>>>
<<<<cvvsaolvmolvmm>>>>
<<<ovirt_vmstats:sep(0)>>>
@ovirt_version_info{"PluginVersion": "1.0.6"}
{"name": "cvvsaolvmolvmm", "type": "server", "statistics": [{"type": "integer", "unit": "bytes", "name": "memory.installed", "description": "Total memory configured", "value": "2147483648"}, {"type": "decimal", "unit": "percent", "name": "cpu.current.guest", "description": "CPU used by guest", "value": "0.46"}, {"type": "decimal", "unit": "percent", "name": "cpu.current.hypervisor", "description": "CPU overhead", "value": "1.4"}, {"type": "decimal", "unit": "percent", "name": "cpu.current.total", "description": "Total CPU used", "value": "1.9"}, {"type": "decimal", "unit": "percent", "name": "network.current.total", "description": "Total network used", "value": "0"}]}
<<<<>>>>
<<<<cvvsaolvmolvmm>>>>
<<<ovirt_snapshots:sep(0)>>>
@ovirt_version_info{"PluginVersion": "1.0.6"}
{"name": "cvvsaolvmolvmm", "type": "server", "snapshots": [{"date": 1692887980190, "snapshot_status": "ok", "snapshot_type": "active", "description": "Active VM", "id": "8890336f-9100-4a5e-95be-aada23e56cd0"}]}
<<<<>>>>
<<<ovirt_snapshots_engine:sep(0)>>>
@ovirt_version_info{"PluginVersion": "1.0.6"}
{"name": "cvvsaolvmolvmm", "type": "server", "snapshots": [{"date": 1692887980190, "snapshot_status": "ok", "snapshot_type": "active", "description": "Active VM", "id": "8890336f-9100-4a5e-95be-aada23e56cd0"}]}
<<<ovirt_compatibility:sep(0)>>>
@ovirt_version_info{"PluginVersion": "1.0.6"}
{"engine": {"instance_id": "4924c078-3c46-11ee-8bbe-005056a058c2", "name": "Oracle Linux Virtualization Manager", "version": {"build": "10", "full_version": "4.4.10.7-1.0.25.el8", "major": "4", "minor": "4", "revision": "0"}}, "datacenters": [{"id": "477ebf4e-3c46-11ee-820a-005056a058c2", "version": {"major": "4", "minor": "6"}, "status": "up", "description": "The default Data Center", "name": "Default", "supported_versions": {"version": [{"major": "4", "minor": "6"}]}}], "cluster": [{"id": "47817b3a-3c46-11ee-9ef9-005056a058c2", "version": {"major": "4", "minor": "6"}, "description": "The default server cluster", "name": "Default", "data_center": {"id": "477ebf4e-3c46-11ee-820a-005056a058c2"}}]}
[root@olvmmi ~]#
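
A side note on reading this output: the `<<<<hostname>>>>` / `<<<<>>>>` pairs are Checkmk piggyback markers, and the sections between them are attributed to that host rather than to the engine host itself. A rough sketch of the grouping logic (illustrative only, not Checkmk's actual parser):

```python
import re

def split_piggyback(agent_output):
    """Group agent output lines by piggyback host.

    Lines between <<<<host>>>> and <<<<>>>> belong to 'host';
    everything else belongs to the host the agent runs on (key None).
    Illustrative sketch, not Checkmk's real parser.
    """
    sections = {None: []}
    current = None
    for line in agent_output.splitlines():
        marker = re.fullmatch(r"<<<<(.*)>>>>", line)
        if marker:
            # empty marker <<<<>>>> switches back to the local host
            current = marker.group(1) or None
            sections.setdefault(current, [])
            continue
        sections[current].append(line)
    return sections
```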

Please find below the output of the debug command:

cmk --debug -vvv --detect-plugins ovirt_plugin.py olvmmi

<<<ovirt_overview:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
<<<ovirt_datacenters:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
<<<ovirt_storage_domains:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
<<<ovirt_clusters:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
PiggybackMarker(hostname='olvmmi01.') / Transition HostSectionParser -> PiggybackParser
PiggybackMarker(hostname='olvmmi01.') SectionMarker(name=SectionName('ovirt_hosts'), cached=None, encoding='utf-8', nostrip=False, persist=None, separator='\x00') / Transition PiggybackParser -> PiggybackSectionParser
Transition PiggybackSectionParser -> NOOPParser
PiggybackMarker(hostname='olvmmi02.') / Transition NOOPParser -> PiggybackParser
PiggybackMarker(hostname='olvmmi02.') SectionMarker(name=SectionName('ovirt_hosts'), cached=None, encoding='utf-8', nostrip=False, persist=None, separator='\x00') / Transition PiggybackParser -> PiggybackSectionParser
Transition PiggybackSectionParser -> NOOPParser
PiggybackMarker(hostname='olvmmi03.') / Transition NOOPParser -> PiggybackParser
PiggybackMarker(hostname='olvmmi03.') SectionMarker(name=SectionName('ovirt_hosts'), cached=None, encoding='utf-8', nostrip=False, persist=None, separator='\x00') / Transition PiggybackParser -> PiggybackSectionParser
Transition PiggybackSectionParser -> NOOPParser
PiggybackMarker(hostname='cvvsaolvmolvmm') / Transition NOOPParser -> PiggybackParser
PiggybackMarker(hostname='cvvsaolvmolvmm') SectionMarker(name=SectionName('ovirt_vmstats'), cached=None, encoding='utf-8', nostrip=False, persist=None, separator='\x00') / Transition PiggybackParser -> PiggybackSectionParser
Transition PiggybackSectionParser -> NOOPParser
PiggybackMarker(hostname='cvvsaolvmolvmm') / Transition NOOPParser -> PiggybackParser
PiggybackMarker(hostname='cvvsaolvmolvmm') SectionMarker(name=SectionName('ovirt_snapshots'), cached=None, encoding='utf-8', nostrip=False, persist=None, separator='\x00') / Transition PiggybackParser -> PiggybackSectionParser
Transition PiggybackSectionParser -> NOOPParser
<<<ovirt_snapshots_engine:sep(0)>>> / Transition NOOPParser -> HostSectionParser
<<<ovirt_compatibility:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
No persisted sections

Neither of these two outputs is the real agent output.
For your 2.2 agent you can use the command "cmk-agent-ctl dump" to get the real agent output.
On the Checkmk server side you can get the same result with "cmk -d hostname".

For your second output, please don't use the "--detect-plugins" switch.
You should get the same output even without this switch.

If you do a "cmk -vvII hostname", were there any ovirt checks found?
If yes, you only need to activate the newly found checks to get them into your monitoring.

Hi @andreas-doehler,

Thank you.
As mentioned, I tried the "cmk-agent-ctl dump" and "cmk -d hostname" commands. In both of these I could see details related to ovirt, but I am still not able to discover the checks inside Checkmk.

Please find below the output of "cmk -vvII hostname".

OMD[pool]:~$ cmk -vvII olvmmi.
Discovering services and host labels on: olvmmi.
olvmmi.:
+ FETCHING DATA
  Source: SourceInfo(hostname='olvmmi.', ipaddress='192.168.158.130', ident='agent', fetcher_type=<FetcherType.TCP: 8>, source_type=<SourceType.HOST: 1>)
[cpu_tracking] Start [7fece3ccccd0]
Read from cache: AgentFileCache(olvmmi., path_template=/omd/sites/pool/tmp/check_mk/cache/{hostname}, max_age=MaxAge(checking=0, discovery=120, inventory=120), simulation=False, use_only_cache=False, file_cache_mode=1)
[TCPFetcher] Execute data source
Connecting via TCP to 192.168.158.130:6556 (5.0s timeout)
Detected transport protocol: TransportProtocol.TLS (b'16')
Reading data from agent via TLS socket
Reading data from agent
Detected transport protocol: TransportProtocol.PLAIN (b'<<')
Closing TCP connection to 192.168.158.130:6556
[cpu_tracking] Stop [7fece3ccccd0 - Snapshot(process=posix.times_result(user=0.010000000000000009, system=0.0, children_user=0.0, children_system=0.0, elapsed=1.3299999982118607))]
+ PARSE FETCHER RESULTS
<<<check_mk>>> / Transition NOOPParser -> HostSectionParser
<<<cmk_agent_ctl_status:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
<<<checkmk_agent_plugins_lnx:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
<<<labels:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
<<<df_v2>>> / Transition HostSectionParser -> HostSectionParser
<<<df_v2>>> / Transition HostSectionParser -> HostSectionParser
<<<systemd_units>>> / Transition HostSectionParser -> HostSectionParser
<<<nfsmounts_v2:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
<<<cifsmounts>>> / Transition HostSectionParser -> HostSectionParser
<<<mounts>>> / Transition HostSectionParser -> HostSectionParser
<<<ps_lnx>>> / Transition HostSectionParser -> HostSectionParser
<<<mem>>> / Transition HostSectionParser -> HostSectionParser
<<<cpu>>> / Transition HostSectionParser -> HostSectionParser
<<<uptime>>> / Transition HostSectionParser -> HostSectionParser
<<<lnx_if>>> / Transition HostSectionParser -> HostSectionParser
<<<lnx_if:sep(58)>>> / Transition HostSectionParser -> HostSectionParser
<<<ovs_bonding:sep(58)>>> / Transition HostSectionParser -> HostSectionParser
<<<tcp_conn_stats>>> / Transition HostSectionParser -> HostSectionParser
<<<multipath>>> / Transition HostSectionParser -> HostSectionParser
<<<diskstat>>> / Transition HostSectionParser -> HostSectionParser
<<<kernel>>> / Transition HostSectionParser -> HostSectionParser
<<<md>>> / Transition HostSectionParser -> HostSectionParser
<<<vbox_guest>>> / Transition HostSectionParser -> HostSectionParser
<<<job>>> / Transition HostSectionParser -> HostSectionParser
<<<chrony:cached(1693069383,120)>>> / Transition HostSectionParser -> HostSectionParser
<<<local:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
<<<ovirt_overview:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
<<<ovirt_datacenters:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
<<<ovirt_storage_domains:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
<<<ovirt_clusters:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
PiggybackMarker(hostname='olvmmi01.') / Transition HostSectionParser -> PiggybackParser
PiggybackMarker(hostname='olvmmi01.') SectionMarker(name=SectionName('ovirt_hosts'), cached=None, encoding='utf-8', nostrip=False, persist=None, separator='\x00') / Transition PiggybackParser -> PiggybackSectionParser
Transition PiggybackSectionParser -> NOOPParser
PiggybackMarker(hostname='olvmmi02.') / Transition NOOPParser -> PiggybackParser
PiggybackMarker(hostname='olvmmi02.') SectionMarker(name=SectionName('ovirt_hosts'), cached=None, encoding='utf-8', nostrip=False, persist=None, separator='\x00') / Transition PiggybackParser -> PiggybackSectionParser
Transition PiggybackSectionParser -> NOOPParser
PiggybackMarker(hostname='olvmmi03.') / Transition NOOPParser -> PiggybackParser
PiggybackMarker(hostname='olvmmi03.') SectionMarker(name=SectionName('ovirt_hosts'), cached=None, encoding='utf-8', nostrip=False, persist=None, separator='\x00') / Transition PiggybackParser -> PiggybackSectionParser
Transition PiggybackSectionParser -> NOOPParser
PiggybackMarker(hostname='cvvsaolvmolvmm') / Transition NOOPParser -> PiggybackParser
PiggybackMarker(hostname='cvvsaolvmolvmm') SectionMarker(name=SectionName('ovirt_vmstats'), cached=None, encoding='utf-8', nostrip=False, persist=None, separator='\x00') / Transition PiggybackParser -> PiggybackSectionParser
Transition PiggybackSectionParser -> NOOPParser
PiggybackMarker(hostname='cvvsaolvmolvmm') / Transition NOOPParser -> PiggybackParser
PiggybackMarker(hostname='cvvsaolvmolvmm') SectionMarker(name=SectionName('ovirt_snapshots'), cached=None, encoding='utf-8', nostrip=False, persist=None, separator='\x00') / Transition PiggybackParser -> PiggybackSectionParser
Transition PiggybackSectionParser -> NOOPParser
<<<ovirt_snapshots_engine:sep(0)>>> / Transition NOOPParser -> HostSectionParser
<<<ovirt_compatibility:sep(0)>>> / Transition HostSectionParser -> HostSectionParser
No persisted sections
  HostKey(hostname='olvmmi.', source_type=<SourceType.HOST: 1>)  -> Add sections: ['check_mk', 'checkmk_agent_plugins_lnx', 'chrony', 'cifsmounts', 'cmk_agent_ctl_status', 'cpu', 'df_v2', 'diskstat', 'job', 'kernel', 'labels', 'lnx_if', 'local', 'md', 'mem', 'mounts', 'multipath', 'nfsmounts_v2', 'ovirt_clusters', 'ovirt_compatibility', 'ovirt_datacenters', 'ovirt_overview', 'ovirt_snapshots_engine', 'ovirt_storage_domains', 'ovs_bonding', 'ps_lnx', 'systemd_units', 'tcp_conn_stats', 'uptime', 'vbox_guest']
Storing piggyback data for: 'olvmmi01.'
Trying to acquire lock on /omd/sites/pool/tmp/check_mk/piggyback/olvmmi01./olvmmi.
Got lock on /omd/sites/pool/tmp/check_mk/piggyback/olvmmi01./olvmmi.
Releasing lock on /omd/sites/pool/tmp/check_mk/piggyback/olvmmi01./olvmmi.
Released lock on /omd/sites/pool/tmp/check_mk/piggyback/olvmmi01./olvmmi.
Storing piggyback data for: 'olvmmi02.'
Trying to acquire lock on /omd/sites/pool/tmp/check_mk/piggyback/olvmmi02./olvmmi.
Got lock on /omd/sites/pool/tmp/check_mk/piggyback/olvmmi02./olvmmi.
Releasing lock on /omd/sites/pool/tmp/check_mk/piggyback/olvmmi02./olvmmi.
Released lock on /omd/sites/pool/tmp/check_mk/piggyback/olvmmi02./olvmmi.
Storing piggyback data for: 'olvmmi03.'
Trying to acquire lock on /omd/sites/pool/tmp/check_mk/piggyback/olvmmi03./olvmmi.
Got lock on /omd/sites/pool/tmp/check_mk/piggyback/olvmmi03./olvmmi.
Releasing lock on /omd/sites/pool/tmp/check_mk/piggyback/olvmmi03./olvmmi.
Released lock on /omd/sites/pool/tmp/check_mk/piggyback/olvmmi03./olvmmi.
Storing piggyback data for: 'cvvsaolvmolvmm'
Trying to acquire lock on /omd/sites/pool/tmp/check_mk/piggyback/cvvsaolvmolvmm/olvmmi.
Got lock on /omd/sites/pool/tmp/check_mk/piggyback/cvvsaolvmolvmm/olvmmi.
Releasing lock on /omd/sites/pool/tmp/check_mk/piggyback/cvvsaolvmolvmm/olvmmi.
Released lock on /omd/sites/pool/tmp/check_mk/piggyback/cvvsaolvmolvmm/olvmmi.
Received piggyback data for 4 hosts
+ ANALYSE DISCOVERED HOST LABELS
Trying host label discovery with: check_mk, checkmk_agent_plugins_lnx, chrony, cifsmounts, cmk_agent_ctl_status, cpu, df_v2, diskstat, job, kernel, labels, lnx_if, local, md, mem, mounts, multipath, nfsmounts_v2, ovirt_clusters, ovirt_compatibility, ovirt_datacenters, ovirt_overview, ovirt_snapshots_engine, ovirt_storage_domains, ovs_bonding, ps_lnx, systemd_units, tcp_conn_stats, uptime, vbox_guest
  cmk/os_family: linux (check_mk)
  cmk/device_type: vm (labels)
Trying host label discovery with:
SUCCESS - Found 2 host labels
+ ANALYSE DISCOVERED SERVICES
+ EXECUTING DISCOVERY PLUGINS (35)
  Trying discovery with: cpu_loads, kernel_performance, mssql_datafiles, kernel, uptime, tcp_conn_stats, diskstat, domino_tasks, md, chrony, windows_intel_bonding, df, mem_vmalloc, check_mk_only_from, local, checkmk_agent, lnx_if, cifsmounts, bonding, job, vbox_guest, docker_container_status_uptime, ps, mounts, mem_win, kernel_util, nfsmounts, mem_linux, cpu_threads, systemd_units_services, systemd_units_services_summary, systemd_units_sockets_summary, multipath, mssql_transactionlogs, systemd_units_sockets
  1 checkmk_agent
  1 chrony
  1 cpu_loads
  1 cpu_threads
  3 df
  1 diskstat
  1 kernel_performance
  1 kernel_util
  1 lnx_if
  1 mem_linux
  3 mounts
  1 systemd_units_services_summary
  1 systemd_units_sockets_summary
  1 tcp_conn_stats
  1 uptime
SUCCESS - Found 19 services

If you add "--debug" to your discovery, you should get an error.
The plugins are looking for a string

if string_table[0][0] != '@ovirt_version_info':

that is a little different from your output:
@ovirt_version_info{"PluginVersion": "1.0.6"}
Here it would be better to check with "startswith".
I don't see why this line exists inside the agent plugin output at all.
Without this "stupid" version line the plugin could just do a json.loads and that would be it.
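
To illustrate the suggested change (the function here is a hypothetical stand-in, not the plugin's actual parse function; only the quoted condition comes from the plugin source):

```python
import json

def parse_ovirt_section(string_table):
    """Parse an ovirt_* section whose first line is the version marker.

    The plugin's original guard,
        if string_table[0][0] != '@ovirt_version_info':
    rejects lines like '@ovirt_version_info{"PluginVersion": "1.0.6"}'.
    Checking with startswith() tolerates the appended JSON.
    """
    first_line = string_table[0][0]
    if not first_line.startswith("@ovirt_version_info"):
        return None  # unexpected section content
    # the remaining lines are plain JSON payloads
    return [json.loads(line[0]) for line in string_table[1:]]
```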

There are some "nice" bugs in the plugin's data output formatting. I think this is the reason why nothing is found on your system.

I am not that good at coding. Could you please suggest how to fix it? :)

I would rewrite the plugin and the checks, but that is a little more work than changing just one or two lines of code. In the end I would change the whole setup from an agent plugin to a special agent, as this agent plugin only queries a web API.
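
For context on that suggestion: a special agent is just a program on the Checkmk server that prints agent-style sections to stdout, so the API querying moves off the monitored host. The section/piggyback framing it would have to emit, visible in the agent output earlier in this thread, can be sketched like this (function name made up for illustration):

```python
import json

def format_section(name, payload, piggyback_host=None):
    """Render one agent-style section the way a special agent would
    print it, optionally wrapped in piggyback markers for another host.
    Illustrative sketch only, not the DECOIT plugin's code."""
    lines = ["<<<%s:sep(0)>>>" % name, json.dumps(payload)]
    if piggyback_host is not None:
        lines = ["<<<<%s>>>>" % piggyback_host] + lines + ["<<<<>>>>"]
    return "\n".join(lines)

# An engine-level section plus a piggybacked host section:
print(format_section("ovirt_overview", {"global_maintenance": False}))
print(format_section("ovirt_hosts", {"status": "up"}, piggyback_host="olvmmi01"))
```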

Hi @andreas-doehler,

I checked further and tried importing the ovirt MKP extension package into Checkmk.
Now I am able to view the oVirt checks inside the Checkmk console, but found that the storage domain checks crashed.

Crash reports for these checks are below:

@andreas-doehler ,

Is this error related to the plugin or something else? How can I correct this trend range error?

Yes, this message is related to the plugin. The default parameters used are not correct when combined with the built-in function "df_check_filesystem_single".
You need to keep in mind that this check was written for 2.0, and with 2.2 there are some changes that you need to fix inside the plugin.
This df check problem is one of them.
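
Not a fix, but for anyone debugging this: the ovirt_storage_domains section reports "used"/"available"/"committed" in bytes, while Checkmk's df helpers such as "df_check_filesystem_single" work, as far as I can tell, with sizes in megabytes, so the check has to convert. A hypothetical helper (name and structure made up, not the plugin's code) showing the kind of values involved:

```python
import json

MIB = 1024.0 ** 2  # bytes per MiB

def storage_domain_sizes(section_line):
    """Turn one ovirt_storage_domains JSON line into
    (name, size_mb, avail_mb) tuples -- the byte-to-MiB conversion a
    df-style check needs. Hypothetical helper for illustration only."""
    data = json.loads(section_line)
    sizes = []
    for domain in data.get("storage_domains", []):
        used = int(domain["used"])
        avail = int(domain["available"])
        # total size is what is used plus what is still available
        sizes.append((domain["name"], (used + avail) / MIB, avail / MIB))
    return sizes
```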

@andreas-doehler Thank you for the update. I think the same plugin is not available for 2.2. On GitHub I could only see versions for 1.6 and 2.0.