Monday, November 26, 2018 — 1:40 PM EST

Candidate: Edmund Wong

Title: Improving Software Dependability through Documentation Analysis

Date: November 27, 2018

Time: 12:30 PM

Place: DC 1331

Supervisor(s): Tan, Lin


Abstract:

Software documentation contains critical information that describes a system's functionality and requirements. Documentation exists in several forms, including code comments, test plans, man pages, and user manuals. The lack of documentation in existing software systems is an issue that impacts software maintainability and programmer productivity. Since some code bases contain a large amount of documentation, we want to leverage this existing documentation to improve software dependability. Specifically, we use documentation to improve both a system's reliability (e.g., failure-free operation) and maintainability (e.g., ease of understanding).

In this thesis, we analyze software documentation and propose two branches of work, which focus on three types of documentation: man pages, code comments, and user manuals. The first branch of work focuses on documentation analysis because documentation contains valuable information that describes a program's behavior. We study constraints from documentation and apply them in a structured-file parsing application, and we extract constraints automatically from documentation and apply them in a dynamic-analysis symbolic-execution tool. The second branch of work focuses on code comment generation because documentation can be scarce and outdated in practice.

For documentation analysis, we propose and implement DocRepair and DASE. DocRepair studies and repairs corrupted PDF files. We create the first dataset of 319 corrupted PDF files and conduct an empirical study on 119 real-world corrupted PDF files to identify the common types of file corruption. DocRepair's repair algorithm includes seven repair operators that utilize constraints manually extracted from documentation to repair corrupted files. We evaluate DocRepair against three common PDF repair tools. Among the 1,827 corrupted files collected from two corpora of PDF files, DocRepair successfully repairs 354 files, compared to 508, 41, and 84 for Mutool, PDFtk, and GhostScript respectively. We also propose DocRepair+, a technique that combines multiple repair tools and successfully repairs 751 files.
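To illustrate the idea of a documentation-derived repair operator (this is a minimal sketch, not DocRepair's actual implementation), consider one constraint stated in the PDF specification: a file must begin with a "%PDF-" version header. A hypothetical operator can restore that header when it is missing:

```python
# Hypothetical repair operator (illustrative only): enforce the PDF-spec
# constraint that a file begins with a "%PDF-" version header.
def repair_missing_header(data: bytes, version: bytes = b"1.4") -> bytes:
    """Prepend a PDF version header if the file lacks one."""
    if data.startswith(b"%PDF-"):
        return data  # constraint already satisfied; leave the file unchanged
    return b"%PDF-" + version + b"\n" + data
```

A repair pipeline could apply a sequence of such operators, each encoding one documented constraint, until a reference parser accepts the file.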

DASE leverages constraints automatically extracted from documentation to improve a dynamic-analysis symbolic-execution tool. DASE guides symbolic execution to focus testing on execution paths that exercise a program's core functionality, using constraints learned from the documentation. We evaluate DASE on 88 programs from five mature, real-world software suites to detect software bugs. DASE detects 12 previously unknown bugs that symbolic execution fails to detect when given no input constraints, 6 of which have been confirmed by the developers.
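As an assumed, simplified sketch of how documentation constraints can narrow a search space (not DASE's implementation): the set of valid command-line options extracted from a man page can filter out inputs that only exercise error-handling paths, steering exploration toward core functionality. The option set below is hypothetical.

```python
# Hypothetical option constraint, e.g. as extracted from a man page.
DOCUMENTED_OPTIONS = {"-a", "-l", "--help"}

def satisfies_doc_constraint(argv: list) -> bool:
    """Keep only inputs whose option flags appear in the documentation."""
    return all(arg in DOCUMENTED_OPTIONS for arg in argv if arg.startswith("-"))

# Undocumented flags like "-z" would mostly trigger error-handling paths,
# so a constraint-guided engine can deprioritize them.
candidates = [["-a"], ["-z"], ["file.txt"], ["--help", "-l"]]
core_inputs = [c for c in candidates if satisfies_doc_constraint(c)]
```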

For automated documentation generation, we propose and implement CloCom and AutoComment. CloCom generates code comments by mining existing software repositories on GitHub. AutoComment generates code comments by mining a question-and-answer site, Stack Overflow. CloCom and AutoComment generate 181 and 144 comments respectively for 15 Java projects.
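The mining approach can be sketched as clone matching (a hedged toy example, not CloCom's or AutoComment's actual algorithm): given mined (code, comment) pairs, reuse a comment when its code's tokens sufficiently overlap the target code's tokens. The mined pairs and threshold below are made up for illustration.

```python
# Toy clone-matching sketch: reuse a mined comment when code tokens overlap.
def token_overlap(a: str, b: str) -> float:
    """Jaccard similarity over whitespace-separated tokens."""
    ta, tb = set(a.split()), set(b.split())
    return len(ta & tb) / len(ta | tb)

# Hypothetical mined (code, comment) pairs.
MINED = [("i = i + 1", "increment the counter"),
         ("fp . close ( )", "close the file")]

def suggest_comment(code: str, threshold: float = 0.5):
    """Return the best mined comment, or None if no pair is similar enough."""
    best_code, best_comment = max(MINED, key=lambda p: token_overlap(code, p[0]))
    return best_comment if token_overlap(code, best_code) >= threshold else None
```

In practice the matching would operate on parsed code (e.g., normalized token streams or ASTs) rather than raw whitespace tokens.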

