Benchmarking C Software Verifiers With LDV Tools » History » Version 10

Alexey Khoroshilov, 11/25/2010 03:00 PM

h1. Benchmarking C Software Verifiers With LDV Tools
The core component of the Linux Driver Verification tools is a verification tool for the C language.  Driver sources, instrumented with environment models and rule models, are eventually passed to such a verifier, which performs the most important part of the whole work.
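To give a feel for what such a verifier receives, here is a minimal, hypothetical sketch of an instrumented driver fragment.  The rule (a "no double lock" check), the model state, and all names below are invented for illustration; in reachability-style verification, which tools like BLAST and CPAchecker perform, the verifier tries to prove the error location unreachable (Safe) or finds a path to it (Unsafe).

```c
/* Hypothetical instrumented driver fragment (names are invented).
 * The instrumentation turns a rule violation into reachability of
 * an error location, which the C verifier then tries to prove
 * unreachable (Safe) or reachable (Unsafe). */

static int lock_held = 0;       /* model state injected by instrumentation */
static int error_reached = 0;   /* stands in for the verifier's error label */

static void model_lock(void) {
    if (lock_held) {
        /* Error location: reached only if the lock is taken twice. */
        error_reached = 1;
        return;
    }
    lock_held = 1;
}

static void model_unlock(void) {
    lock_held = 0;
}

/* Instrumented driver operation with correct lock discipline. */
static void driver_op_fixed(void) {
    model_lock();
    model_unlock();
}

/* Instrumented driver operation with a double-lock bug:
 * the error location becomes reachable. */
static void driver_op_buggy(void) {
    model_lock();
    model_lock();
    model_unlock();
}
```

A verifier given the fixed variant would report Safe; given the buggy variant, it would report Unsafe with a path to the error location.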
No wonder that the speed and quality of the overall analysis heavily depend on how well and how fast the verifier performs.  But the best verifier is yet to be found!  There are several static C verifiers, which perform differently, and without experimenting it's not clear which one best fits our purpose.  Moreover, verifiers usually offer a number of tweaks and configuration options, which should also be explored.
So one of the goals of the LDV program is to build a framework that helps us answer questions such as:

* Which configuration of a verifier works faster on a test driver set?
* Which verifier checks more drivers within a given time limit?
* How would supplying the verifier with more memory affect verification?

We developed special components that gather, analyze, visualize and compare verification statistics for a lot of different drivers.  The drivers can either be fetched automatically from a pre-assembled test set or extracted automatically from the Linux kernel.
h2. Visualization and comparison
Here's a sample experiment.  We used a test set of some "interesting" drivers (fixed and unfixed versions of those in which we used to find bugs), which is included in the default shipment of LDV tools (it's called @generic@), to run "CPAchecker":http://cpachecker.sosy-lab.org/ with different time limits.  Here are the results:
!http://forge.ispras.ru/attachments/518/bench1.png!
From this table we could see that (actually, this table doesn't show us anything interesting yet.  I'll re-run the tests with different configurations and re-create the table to demonstrate that the results can be useful).
Now let's see how many degradations or improvements there were.  First, select some test runs to compare (hit the radio button near the "basis" run, and check the boxes of the runs to compare against it):
!http://forge.ispras.ru/attachments/519/bench2.png!
And press the "Compare tasks" button to get to the comparison page:
!http://forge.ispras.ru/attachments/520/bench3.png!
Here we can see that our "basis" configuration dominates all the others.  Indeed, some of the drivers that were detected as Safe or Unsafe in the "basis" configuration are not detected anymore, and appear as "Unknown" in the other configurations we selected.
h2. How to use LDV for benchmarking
To use this benchmarking system you should first "download the sources":/projects/ldv/wiki/Downloading_and_Building_LDV of LDV tools (we recommend the master branch).  After downloading, refer to the file @VerifierDevelopersHowTo@ (see source:VerifierDevelopersHowTo).  It explains how to install and use LDV tools specifically for benchmarking verification tools.
Currently, only the BLAST and CPAchecker verifiers are supported.  However, a convenient way to plug in other verifiers is on our roadmap.