DEPARTMENT OF COMPUTER SCIENCE
Dr. Jens Krinke
My areas of interest are Software Testing, Continuous Integration, Code Review, Program Analysis, Clone Detection, and Bug Detection. If you are interested in doing a project in one of these areas, please contact me.
You will find a list of concrete topics below.
Does Code Readability impact Code Review?
One of the aims of Code Review is to maintain and improve the readability of code. However, readability is very subjective and hard to measure automatically. Buse and Weimer have attempted to define a metric that measures the readability of code.
In this project, the Buse/Weimer metric will be used to measure the readability of code that undergoes code review, and a statistical analysis will be applied to determine whether there is a statistically significant difference in the readability of code before and after code review.
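The analysis could be sketched as follows. Note that `score` here is a deliberately toy stand-in for the Buse/Weimer metric (the real metric combines many lexical and structural features), and the hand-built paired sign test is one simple choice of statistical test; all names and sample data are illustrative assumptions.

```python
# Sketch: paired comparison of a readability proxy before/after review.
# score() is a TOY stand-in for the Buse/Weimer metric; a real study
# would plug in the actual metric implementation here.
import math
import statistics

def score(code: str) -> float:
    """Toy readability proxy: shorter average line length reads easier."""
    lines = [l for l in code.splitlines() if l.strip()]
    if not lines:
        return 1.0
    avg_len = statistics.mean(len(l) for l in lines)
    return 1.0 / (1.0 + avg_len / 40.0)  # higher = more readable

def sign_test_p(diffs: list[float]) -> float:
    """Two-sided sign test on paired before/after differences."""
    nonzero = [d for d in diffs if d != 0]
    n, k = len(nonzero), sum(d > 0 for d in nonzero)
    # two-sided tail probability under Binomial(n, 0.5)
    tail = sum(math.comb(n, i) for i in range(max(k, n - k), n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Illustrative data: the same five snippets before and after review.
before = ["x=compute_total(a,b,c)+compute_discount(a,b,c)*rate_for(a)"] * 5
after = ["total = compute_total(a, b, c)\nd = compute_discount(a, b, c)"] * 5
diffs = [score(a) - score(b) for a, b in zip(after, before)]
print(sign_test_p(diffs))
```

In the real project the pairs would be the pre- and post-review versions of each reviewed file, and a standard test such as the Wilcoxon signed-rank test would likely be preferred.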
Collecting Fault-Correcting Patches for Integration into Code Review
This project focuses on the extraction and collection of committed patches that correct faults (bugs), and on building a database of such patches that can be automatically compared to newly created changes (patches) during code review. To do so, patches first need to be extracted from a set of software repositories. Then an approach to identify fault-correcting patches needs to be developed, so that the extracted patches can be labelled accordingly. This requires research into existing techniques and the development of a technique that can label fault-correcting patches with high precision. In the last step, a sequence of changes submitted for code review (the CROP data set) is replayed, and for every changed method, the constructed database is searched for similar fault-correcting patches.
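A common first heuristic for the labelling step is keyword matching on commit messages, which the project would then refine for precision. The sketch below illustrates the idea; the pattern, commit hashes, and messages are illustrative assumptions, not part of the project specification.

```python
# Sketch: label commits as fault-correcting by keyword matching on the
# commit message -- a simple baseline heuristic a real project would
# refine (e.g. by linking commits to issue-tracker bug reports).
import re

FIX_PATTERN = re.compile(
    r"\b(fix(e[sd])?|bug|fault|defect|issue\s*#?\d+)\b", re.IGNORECASE
)

def is_fault_correcting(message: str) -> bool:
    """Heuristic label: True if the commit message mentions a fix/bug."""
    return bool(FIX_PATTERN.search(message))

commits = [
    ("a1b2c3", "Fix null pointer dereference in parser"),
    ("d4e5f6", "Add dark mode to settings page"),
    ("g7h8i9", "Resolve issue #42: crash on empty input"),
]
labelled = [(sha, msg) for sha, msg in commits if is_fault_correcting(msg)]
for sha, msg in labelled:
    print(sha, msg)
```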
Are Developers Aware of Clones When They Make Code Changes?
This project aims to leverage the rich information in the code review process, together with automatic code clone detection, to assess developers' awareness of code cloning on a day-to-day basis. Code clone detection will be performed before and after each code review patch is submitted. With the detected clones, the code changes from the patch, and the natural-language text in the review, we can ask several interesting questions through a manual analysis of the results: (1) How often are developers aware of clones when they are introduced into the system, i.e. do they discuss them in code review? (2) When developers discuss clones, what happens to the clones in the system after the discussion? (3) What are common intents when developers introduce clones into the software? (4) What is the developers' perception of code clones? Should clones always be removed, or is it just good to know they are there?
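The before/after detection step could be sketched with a minimal token-based clone detector, comparing the clone pairs found before and after a patch. This is only an illustration under assumed inputs; a real study would use a mature clone detector, and the fragment names and code snippets below are hypothetical.

```python
# Sketch: a minimal Type-2 clone detector (identifiers/literals
# normalised to placeholders), run before and after a patch to see
# which clone pairs the patch introduced.
import re
from itertools import combinations

def normalize(line: str) -> str:
    """Replace numbers and identifiers with placeholders."""
    line = re.sub(r"\b\d+\b", "N", line)
    return re.sub(r"\b[A-Za-z_]\w*\b", "ID", line)

def clone_pairs(fragments: dict[str, str]) -> set[tuple[str, str]]:
    """Pairs of fragments whose normalised form is identical."""
    norm = {name: tuple(normalize(l.strip()) for l in code.splitlines())
            for name, code in fragments.items()}
    return {tuple(sorted((a, b)))
            for a, b in combinations(norm, 2) if norm[a] == norm[b]}

before = {"f": "total = price * 2", "g": "render(page)"}
after = dict(before, h="count = size * 3")  # patch adds a near-copy of f
new_clones = clone_pairs(after) - clone_pairs(before)
print(new_clones)  # the clone pair introduced by the patch
```

The newly introduced pairs are the ones to cross-check against the review discussion: did the reviewers mention them?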
Cross-Referencing for Deletion Dependence
In the ORBS project, programs are manipulated by deleting statements and source code lines as much as possible. Very often a deletion is attempted, but the resulting source code cannot even be compiled. This project will build a cross-referencing tool that extracts links between lines of the analysed source code. Such a link establishes that a line x can only be deleted once a line y has already been deleted. The extracted links will be used to improve the ongoing ORBS research project.
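One way to discover such links is by trial deletion: if deleting line x alone breaks the build but deleting x together with y succeeds, then x depends on y. The sketch below demonstrates this on a toy Python snippet, using Python's built-in `compile()` as a stand-in for the compiler ORBS would invoke; the sample program and the pairwise search are illustrative simplifications.

```python
# Sketch: infer "x deletable only after y" links by trial deletion,
# with compile() standing in for the real build step.
def compiles(lines: list[str]) -> bool:
    try:
        compile("\n".join(lines), "<candidate>", "exec")
        return True
    except SyntaxError:  # includes IndentationError
        return False

def deletion_links(lines: list[str]) -> set[tuple[int, int]]:
    """(x, y) means: deleting x alone fails to compile, but deleting
    x together with y succeeds -- so x's deletion depends on y's."""
    links = set()
    for x in range(len(lines)):
        if compiles(lines[:x] + lines[x + 1:]):
            continue  # x is independently deletable
        for y in range(len(lines)):
            if y == x:
                continue
            rest = [l for i, l in enumerate(lines) if i not in (x, y)]
            if compiles(rest):
                links.add((x, y))
    return links

program = [
    "if x:",      # 0: deletable only together with its body (line 1)
    "    y = 1",  # 1: deletable only together with its header (line 0)
    "z = 2",      # 2: independently deletable
]
print(deletion_links(program))
```

For realistic programs, pairwise trial deletion is expensive, so the actual tool would extract such links statically from the source code rather than by repeated compilation.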
Code-to-Test Traceability via Co-Change Analysis
Code-to-test traceability establishes links between code and the tests that test it. Establishing such links is not straightforward, and multiple approaches have been developed. This project will establish such links by analysing the change history of projects: if a method and a test are added or changed at the same time, it is very likely that the method is tested by that test. To establish these relationships, the project will develop a tool that automatically analyses the commit history of a project, extracts all added or changed methods, and then applies a data mining approach to extract the co-changed test-method pairs.
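The mining step could be sketched as counting how often each method and test change together, keeping pairs above a support threshold. In this sketch, commits are modelled as pre-extracted (methods, tests) sets; in the real tool these would come from parsing `git log` diffs, and the names and threshold are illustrative assumptions.

```python
# Sketch: mine test-to-method links from co-change counts.
from collections import Counter

def co_change_pairs(commits: list[tuple[set[str], set[str]]],
                    min_support: int = 2) -> list[tuple[str, str, int]]:
    """Return (method, test, count) for pairs changed together at
    least min_support times."""
    counts = Counter()
    for methods, tests in commits:
        for m in methods:
            for t in tests:
                counts[(m, t)] += 1
    return [(m, t, c) for (m, t), c in counts.items() if c >= min_support]

# Illustrative history: each commit = (changed methods, changed tests).
history = [
    ({"parse"}, {"test_parse"}),
    ({"parse", "render"}, {"test_parse"}),
    ({"render"}, set()),
]
print(co_change_pairs(history))
```

A refinement would be to normalise the counts (e.g. by how often each method changes overall) so that frequently changed methods do not dominate the pairs.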
Last modified: 05/11/2020