Authors: Edwards, Stephen; Spacco, Jaime; Hovemeyer, David
Dates: 2019-01-03; 2019-01-03; 2019-01-08
ISBN: 978-0-9981331-2-6
URI: http://hdl.handle.net/10125/60221

Abstract: Static analysis tools evaluate source code to identify potential problems or issues beyond typical compiler errors. Prior work has shown a statistically significant relationship between the correctness of a student's work and statically identifiable flaws or "code smells" that are likely to indicate programming errors. This paper presents a comprehensive study of this relationship in the context of small programming exercises intended for use in student skill building. We use FindBugs, a static analysis tool that identifies program features that are likely to represent actual bugs in professional software. Our goal is to identify the extent to which FindBugs warnings might help novices struggling to solve short programming exercises. In this study, we ran FindBugs against 149,054 answers submitted by 516 students on 57 drill-and-practice coding exercises. We identify the specific FindBugs warnings that are inversely correlated with correctness. We confirm that the presence of these warnings is significantly associated with struggling on an exercise, as indicated by taking more time, making more submissions, and receiving lower scores. Finally, every exercise exhibited answers that trigger these warnings, and 92.4% of students would experience these warnings over a full semester. Our results indicate that static analysis with tools designed for use in industry offers an untapped opportunity to provide hints or suggestions to students who are measurably struggling.

Extent: 10 pages
Language: eng
Rights: Attribution-NonCommercial-NoDerivatives 4.0 International
Subjects: Vision and Novel Delivery; Software Engineering Education and Training; CS Education
Keywords: static analysis; drill-and-practice; errors; bugs; hints; CodeWorkout; FindBugs; CS1; CS2
Title: Can Industrial-Strength Static Analysis Be Used to Help Students Who Are Struggling to Complete Programming Activities?
Type: Conference Paper
DOI: 10.24251/HICSS.2019.941