WHAT I HAVE LEARNED IN TWENTY YEARS OF TEACHING THE SOFTWARE PROJECT COURSE
Dr. William Mitchell
LSU-Shreveport
wmitchel@pilot.lsus.edu
ABSTRACT
The author has been teaching the software engineering project course for nearly two decades, and over that interval both the understanding of the discipline and the goals for an undergraduate introduction have evolved.
Today more than ever there is the realization that the engineering aspects of developing software products are difficult to teach within computer science curricula. The author observes that teaching about process and its relationship to product has become the focus of his course. This paper traces the evolution of the course and describes the structure that it now exhibits.
PROGRESSION OF COURSE GOALS
In the late 1970s I began teaching a year-long software engineering project course which was the capstone course in a CS/CIS major within a College of Engineering. To coincide with the engineer's senior project this course required students to individually demonstrate that they could execute all phases of software development. In the fall quarter the student conducted a systems analysis ending with a functional specification for a project. During the winter quarter a design for a solution was derived, culminating with a detailed design document. In the spring quarter the student implemented and documented the product, testing as time allowed. As much as possible the project was for an off-campus client and otherwise for a user elsewhere in the university. The language was often PL/C and later Pascal but depended upon the client's environment. The specific prerequisites for the senior project were successful completion of the major's programming core, a sophomore course in structured systems analysis, and a junior course (team-oriented) in structured design. The project course itself was conducted as an independent study. This pattern was employed with little change for a decade but with a steadily rising concern that more about software engineering needed to be presented.
My arrival at LSU-Shreveport provided the stimulus to look at a new way to teach the project course. In 1989 I was asked to teach a six semester hour course to 16 students as a transition from a one-semester course requirement to a two-semester course requirement (we met two nights a week, three hours a night). Having been impressed by the discussions in the literature concerning redundant and competitive programming in sub-groups [1, 2], I divided the class into four teams and had the teams exchange work products through the semester (each team designed from another team's specification and implemented another team's design). I chose the product, a church accounting system; I specified that it would be coded in Turbo Pascal using the database toolkit; and I used a software engineering text for the first time. Very few of the students had developed programs of more than 500 lines and most had no previous experience working in teams. These students were also unfamiliar with analysis or design methodologies. Skirting the latter ignorance by playing the role of both user and architect myself, I focused on introducing students to team programming, permitting them to experience the pressures of developing software under deadlines, and educating them about the software project development life-cycle from start to finish. I emphasized delivering working software, tested and documented, my traditional goals, but I also focused on the process of coordinating the work of many team members.
The product failed integration (the user interface team and the database team did not get the details of their interface right) so the test team had little to do. One senior told me later that he could have easily completed the whole project by himself, but that he had been frustrated by having to communicate with and depend upon so many others (less capable than himself). I assured him that he had learned a great deal about software engineering, especially about its difficulty.
Through the early 1990s the two-semester software project course at LSU-Shreveport was structured around the production of a series of documents, but it contended with the fact that most students entering the project sequence had dealt only with small, discrete Pascal programming assignments [3]. We introduced a CASE tool in the first semester and developed the functional specification of a project as a collaboration of two or three students while surveying the spectrum of software engineering topics presented in the textbook. I tended to teach the second semester of the sequence, where I picked up with design and continued through implementation with little further reference to the text.
In this environment I chose to continue the focus on the process of software engineering as it is practiced in teams [4]. At the start of the second semester I would close down half the projects and reassign students to form teams of four to six. This was traumatic for many students, and I experimented with several ways of accomplishing it, ranging from autocratic to democratic. In order to teach how to function in small teams I adapted a role-based methodology from the work of Larry Constantine (summarized by Marc Rettig [5]). My grading emphasized a team grade for product and gave significant weight to peer evaluation, and I continued to impress upon students the need for documentation at every stage. Each role was responsible for a document, each team participated in several presentations and reviews, and the year ended with a public project presentation.
At the start of the second semester the new teams were given the following role descriptions and decided among themselves who would assume which one. These roles are functional in nature, not management responsibilities (I presented the concept of "leaderless teams" but I did not worry about who ran the team). I emphasized that all team members would program and would assist in the development and presentation of all the documents.
Constantine proposed that members of a small development team rotate roles over the duration of a project, but his suggested term of office is roughly the length of a semester, so I focus instead on role interactions to assure that everyone learns the issues involved in each role. Individuals who develop QA or testing plans pass them on to others to apply. The Software Maintenance Document is a compilation of material assembled in the first semester, integrated with the results of refining and testing those plans. It includes anything which would serve to orient and assist the maintenance programmer, including test data for regression testing. The documenter and code librarian contribute significantly to the SMD.
In the late 1990s I became responsible for both semesters of the software project course and reassessed the course objectives once again in the face of changing practices in the industry, both locally and globally [6]. While continuing to utilize the methods and structures that have proved themselves through the decade, I have established the following priorities:
Concluding that the project course is too intense to bear the additional responsibility for introducing tools and languages, the department a few years ago made either the RAD or the DBMS course a prerequisite to the project course. This ensures that all students will already have some familiarity with a PC development environment (Access, Delphi, or Visual Basic). I have been left to address the unequal distribution of work between the two semesters and the difficulty that larger teams have in collaborating (given that LSU-Shreveport is a commuter campus for working students).
THE REQUIREMENTS OF THE CURRENT COURSE
The course design which will be employed into the new millennium is captured in a series of documents which are developed on a rigorous schedule. The first class meeting emphasizes how software engineering differs from the programming experiences previously encountered (although we continue to look for ways to build more project experience into lower-level courses). The students fill out a form that inventories their software development experience and knowledge. I conclude the first week's overview of the academic aspect of the course with the assignment to form pairs and begin to investigate a possible project that will meet certain criteria. This investigation results in both a class presentation and the submission of a formal proposal document during the third week of the course. The purpose of the exercise is first to generate a group of feasible projects from which the class can select those to pursue; second, to force consideration of what constitutes a feasible project; and third, to categorize the candidate projects by their instructional potential (a reminder that our goal is to learn about software engineering, not build a piece of software). The sections of the product proposal require students to evaluate these project characteristics and are described to the students as follows:
Page 1
1. Authors
2. Title of software application
3. Possible local client(s)
4. Description of the need which the application would meet
Page 2
5. Description of the functions which the software would implement
   - List and describe each function and the nature of the data manipulation it performs
Page 3
6. Description of the environment in which the software would operate
   - Hardware platform and other accessories (printer, network, etc.)
   - Other software with which the application interacts (Windows, DOS, DB, WP, etc.)
   - Data which must be managed by the application or a helper application
7. Discuss the tools needed to implement the software
Page 4
8. Identify the skills needed by the development team
   - Give job descriptions of the desired roles and time estimates (consider both semesters)
Page 5
9. Describe the risks involved in the project (the unknown areas or the known difficulties)
10. List sources of possible information that should be referenced to further explore feasibility.
Page 6
11. Describe what would be learned in accomplishing the construction of this software
During the second week the students prepare and submit resumes, an exercise that reminds them of the human resources available for the projects. At the start of the third week, in order to introduce the projects to the class and the students to each other, the various candidate projects are presented in class according to a prescribed format.
In the next class following the presentations, students submit an assessment of the project pool in the form of a rating table. Each candidate project has a column, and each is rated on a 1-to-10 scale for the perceived difficulty of the project, the development environment projected for the project, the degree of "from scratch" coding which the project requires, the complexity of the user interface, and the proposed organization of the team. In addition, each student rates his/her personal interest in participating in each project (most rating their own project the most interesting). I compile these ratings and discuss them with the class the next day. I then suggest which are the most feasible candidates (merging my own and the student evaluations) and ask the students to bid on which of the selected projects they would prefer to work on (first choice and alternate). The following class period I announce the projects and the five- or six-person teams which will pursue them, basing my decision on both the resumes and the rating preferences.
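The compilation step is simple averaging over students, per project and per criterion. The following sketch illustrates it; the project names, criterion labels, and ratings are invented for illustration, not drawn from any actual class:

```python
# Hypothetical compilation of the project-rating tables.
# Each student rates every candidate project 1-10 on five project
# criteria plus personal interest; the instructor averages per project.

criteria = ["difficulty", "environment", "from_scratch_coding",
            "ui_complexity", "team_organization", "interest"]

# One dict per student: project name -> six ratings (same order as criteria).
student_ratings = [
    {"Church Accounting": [7, 5, 6, 4, 5, 8], "Clinic Scheduler": [5, 6, 4, 6, 5, 4]},
    {"Church Accounting": [6, 4, 7, 5, 6, 5], "Clinic Scheduler": [4, 7, 5, 5, 4, 7]},
]

def compile_ratings(ratings):
    """Average each criterion over all students, per project."""
    projects = {}
    for table in ratings:
        for project, scores in table.items():
            projects.setdefault(project, []).append(scores)
    return {
        project: {c: sum(s[i] for s in score_lists) / len(score_lists)
                  for i, c in enumerate(criteria)}
        for project, score_lists in projects.items()
    }

averages = compile_ratings(student_ratings)
print(averages["Church Accounting"]["difficulty"])  # mean of 7 and 6 -> 6.5
```

The averaged table is then merged with the instructor's own judgment, as described above; the averaging itself carries no weighting.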
The subsequent week the newly-formed teams each submit a prototype description document and begin work on a requirements document. The prototype is used to clarify functional specifications and to focus the team on the characteristics of the product it proposes to develop. The document has the following sections:
1. Project Title, Team Name, Team Members
2. Product description (1 page) suitable for use on the box that sells the software
3. Architectural Diagram which describes the logical (functional) components of the product
4. Menu tree which describes the access paths to the functions
5. Assigned duties of each team member for the next thirty-day period. These duties will encompass (a) development of the user interface prototype (to be demonstrated in thirty days), (b) development of the requirements document (due in three weeks), and (c) development of acceptance tests for the product.
This document serves to introduce the team to parallel activity, to the concept of scheduling, and to the importance of quality assurance. The lectures and the textbook have explained the requirements definition process, so the specifics of the requirements document are not revealed now. Instead, the teams work for two weeks on collecting the information that they believe will be required. A week before it is due, the following specification of the Requirements Document is distributed:
1. Title page: Project Name, Team Members, Date
2. Table of Contents
3. Systems Context (describe the activities of the user which will be automated and the way the automation will interact with the boundary non-automated activities, emphasizing the changes which the user will have to make in his/her present routines to accommodate the new system).
In appendix A assemble any forms which are in use in the present system that will provide input for, or which will be produced by, the new system. In appendix B include a description of the user's current information system, highlighting the components which will be incorporated into the new system (section 3 describes the NEW system while appendix B describes the OLD system).
4. System Expectations (list the services which the user expects the system to provide by category of service, listing the most essential category first, and then listing the services within each category in order of priority (mandatory, practical, nice). Each service will be described by enumerating its specific functions, its value to the user (what alternative does the user have to satisfy this need--the service is either a new service or a substitute (improvement) for an existing process), and any non-functional requirements which are associated with that service).
[Categories correspond to the components of the functional architecture, i.e. Reports; specific services are billing reports, lab reports, etc.; and functions are report types or formats.]
5. A data-flow diagram of the proposed system. This should be a multi-level diagram which illustrates every service and every function mentioned in section 4.
6. A data dictionary of all terms (data items, data collections, functions, services) used in the document or its appendices.
7. A list of quality attributes of the product (those characteristics which the user would admire in the product) and the corresponding metric(s) and the manner in which they could be employed to determine the quality level of the product. These attributes would include characteristics of the user interface important to the user, the organization and coherency of the product which make it easy to navigate, the performance characteristics, the scope and capacity of the product, the flexibility and extensibility of the product, and its fit into the present context.
After the prototypes have been presented (and compared to the requirements submitted a week earlier) the teams turn to producing a formal specification document. I critique each of the team reports in writing and raise questions of completeness and appropriateness. Because the whole class has participated in the same exercises and analyzed the same projects, I return these critiques to everyone. I also emphasize that the work of the semester is cumulative. The material collected for each report is reused in varying degrees and forms in the next report. Therefore the Functional Specification is not produced de novo, but refines the Requirements Document and incorporates the GUI prototype. The focus, moreover, continues to be a description of WHAT the product will look like to the user. So much detail is required that the teams do not have much time to anticipate HOW they will accomplish the implementation. The Functional Specification is submitted two weeks after the prototype demonstration and seeks to freeze what must be delivered in the product, separating it from what may optimistically be delivered or left for the next release. The document includes the following sections:
1. Title page, table of contents
2. Team members and what contributions each has made to the team's effort.
3. Description of the product's end user, focusing in on the need which this product will meet. [Although you may identify and describe the specific user who motivates the project, the product should not be viewed as uniquely customized (in-house), but should be abstracted to a generic user of the type your user represents.]
4. A data flow diagram of the system of activities in which your user is engaged, which includes all of the entities which provide or consume data or reports which your product will process. On this diagram will be drawn the automation boundary, the circle which encloses the functions which your product will perform. This boundary could be extended to include some adjacent manual process, or retracted to require that some process be done manually instead of by the computer. At the boundary where data input occurs, you need to indicate what processing (what decisions), if any, the data entry person must perform with the data to convert it from the format it has (usually on a form) in order to enter it into the computer.
5. An architectural diagram of the product in which its internal components and their interactions are shown.
6. A representation of the user interface which will show the menu tree of the GUI abstractly, listing the functions which can be performed on each screen (the screen layout is arbitrary or inconsequential). If your menus are all pull-down or activated from the tool bar, the main menu is the root and each button starts a branch of the menu tree.
7. A list of the logical or user functions (each named) accessible from the GUI, one or two per page. For each function indicate the preconditions; the information the user must have on hand (that the user is thinking about and using while performing the function); the sub-functions, if any, which must be performed by the program as the user enters the steps of the process [the menu tree should permit the user to select the processing which is to be done; this function, from the user's point of view, should be a whole module (perhaps one of a short sequence of modules, each with clear boundaries), but from the program's viewpoint several distinct processing activities might be triggered by the module]; WHAT (not how) the logical function accomplishes; what the post-condition is (how the state of the program/database has changed as a consequence of the execution of this function); and what the user sees when the function processing is complete (there may be different ways in which the function terminates, depending on error conditions or user intervention). [In the design document you will break these user functions down into named program modules.]
8. A description of the reports which will be generated by the program (any printed output).
9. A list and description of the non-functional requirements which the program will satisfy, including size and speed capacities of the host PC or network, security, and backup provisions.
10. A Summary of how the requirements have been clarified during your feasibility study, including especially an explanation as to how the product description has changed over the first two months as represented by previous hand-ins. This Summary will also highlight any continuing areas of ambiguity and possible trouble spots in the current specifications. Include a list of program functions which are not presently planned for the initial product, but which might be desirable.
11. A glossary of terms (user and computer) which might be unfamiliar to a user or a programmer working on this product.
12. The Appendix will contain a statement from the user certifying the completeness of the product as represented by the prototype. (Your previous definition document will be added to the Appendix also, so you might wish to augment or improve the materials you submitted earlier.)
13. This list is minimal. You may wish to include additional sections or expand upon the material in any section described above. Do not be concerned with design details, however (answers to HOW processing will be done internally).
In the final two weeks of the semester, following the submission of the Functional Specification, each team produces its design document and project schedule for the following semester. During this period the critique of the specification document is received and the survey of software engineering provided by the textbook is completed. The earlier documents have served to satisfy the user that the right product was being planned and the designer that the necessary functionality was understood. The design document focuses on the information needed by the programmers, who will work individually on various parts, with the document providing all the guidance needed to ensure that the parts will fit together. The design document is, therefore, more tabular than narrative. Although a great deal of information must be tabulated, much of it is available in previous documents and can be assembled in parallel. While teams are aware that next semester they will be required to produce a test plan, user documentation, and a maintenance manual, the schedule focuses exclusively on the sequence of module implementations. The teams are unaware that a different set of milestones will be imposed upon them at the start of the next semester.
The Design Document must contain at least four sections and appendices:
Section 1: Tabulation of components. The implementation should be designed to engage the whole team in parallel work, and therefore the design consists primarily of breaking the product into code components that must be constructed by the implementation team. The breakdown may closely or only loosely follow the architectural diagram that described the product in terms of subsystems. Anticipate that there will be on the order of 30-50 code components (modules), counting each screen and assuming that each screen interacts with several other components, many of them custom designed to perform the screen's function. The tabulation entry for a component will give the component a name, provide a one-sentence description of the component's purpose, list its inputs and its outputs (its interfaces), and label its risk. The labels will be STRAIGHTFORWARD, COMPLEX, and UNCLEAR (the algorithms or internal data formats for this component are not yet well understood). However the components are listed (alphabetical by name is recommended), the component description is intended to suggest "bottom-up" assembly of the product. Only the components that must be built and those which exist but must be substantially modified appear in the list. There will be other components, for example dialog boxes, which will come with the programming language environment (section 3).
Section 2: Data descriptions. Two tables must be constructed, the first of data items (fields whose values are input or calculated and which are displayed on the printer or the screen), and the second of data aggregates (database tables). Each data item is named, its internal and external representations are indicated, and it is marked as isolated or part of an aggregate. Occasionally a data item will be replicated through several variables, and in that case it has a list of aliases. Sometimes the data item changes internal representation, and other times its displayed representation differs from its internal representation; such circumstances should be flagged. Data items and aggregates are listed alphabetically by name. NOT INCLUDED in the list are internal counters, flags, temporary storage locations, etc. A data aggregate is named by its internal file or table name and shows the record structure of the aggregate, both data fields and association fields (links, keys, etc.). Each aggregate has a repetition factor (the number of rows, the number of records, etc.).
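A data-item entry of the kind Section 2 calls for can be sketched as a small record type. The field names and the example item below are invented for illustration; the point is that flagging a representation change becomes a mechanical check:

```python
# A minimal record type for a Section 2 data-item table entry.
# Field names and the example item are hypothetical.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DataItem:
    name: str
    internal_repr: str                 # e.g. "currency (4-byte fixed point)"
    external_repr: str                 # e.g. "$#,##0.00" on screen or printer
    aggregate: Optional[str] = None    # owning table name, or None if isolated
    aliases: List[str] = field(default_factory=list)  # replicating variables

    def representation_changes(self) -> bool:
        # Flag items whose displayed form differs from the stored form,
        # the circumstance Section 2 says must be flagged.
        return self.internal_repr != self.external_repr

balance = DataItem("member_balance", "currency", "$#,##0.00",
                   aggregate="Members", aliases=["bal_display"])
print(balance.representation_changes())  # -> True
```

Sorting a list of such records by `name` reproduces the alphabetical listing the section requires.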
Section 3: Implementation language and tools. The components provided by the language or its libraries are listed here (usually GUI components and data base services). An assessment of the benefits of using this language for implementation follows (show how and where it will ease the implementation) including a discussion of any drawbacks (such as size of the resulting code, speed, licensing, etc.)
Section 4: Schedule. In tabular form list the modules to be constructed (section 1), the effort in person-days estimated for producing and testing each module, and the precedence order of the module (what components will already be available to help in testing the module). This listing is most naturally produced by enumerating first those modules without dependence on others, then those which depend on the modules already listed, and so on until all components have been accounted for. You should also schedule "dummy" activities that account for the time needed to integrate components into subsystems, and insert those activities in the list. Finally, add two columns in which you specify a start date and a finish date (delivery of the tested component, or completion of integration of the subsystem) for each entry. These dates should all fall in the interval from the first day of the second semester to the first day of the last month of that semester, though some work could start sooner. The dates should be chosen to conform to the dependencies and to the estimated effort (recognizing that no one has whole person-days to devote to implementation).
The schedule table will therefore have the form:
ID   Module/Activity Name   Effort (PD)   Prereq. Module IDs   Start Date   Delivery Date
__   ____________________   ___________   __________________   __________   _____________
__   ____________________   ___________   __________________   __________   _____________
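The "enumerate first those without dependence" rule of Section 4 is a topological ordering of the module dependency graph, and earliest finish dates fall out of the same traversal. A sketch using Python's standard library; the module names, dependencies, and efforts are invented, and the finish-date estimate naively assumes unlimited parallel workers:

```python
# Order modules so each appears only after the components it depends on,
# then estimate earliest finish (in person-days) along the dependency chain.
# Module names and efforts are hypothetical.
from graphlib import TopologicalSorter  # Python 3.9+

# module -> set of prerequisite modules (components already available)
dependencies = {
    "LoginScreen":  set(),
    "MemberTable":  set(),
    "EntryScreen":  {"LoginScreen", "MemberTable"},
    "ReportModule": {"MemberTable"},
    "Integration":  {"EntryScreen", "ReportModule"},  # a "dummy" assembly activity
}

effort_days = {"LoginScreen": 3, "MemberTable": 4, "EntryScreen": 6,
               "ReportModule": 5, "Integration": 2}

# Dependency-first listing, as the schedule table requires.
order = list(TopologicalSorter(dependencies).static_order())
print(order)

# Naive earliest-finish estimate: a module can start only when all of its
# prerequisites have been delivered (assumes unlimited parallel workers).
finish = {}
for m in order:
    start = max((finish[p] for p in dependencies[m]), default=0)
    finish[m] = start + effort_days[m]
print(finish["Integration"])  # -> 12
```

In practice the start and delivery dates must also respect who is assigned to each module; this sketch captures only the precedence constraint.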
Appendices:
A. Examples of every input form associated with the new system
B. Examples of every report produced by the new system.
C. Schematics of all screens (in tree order (pre-order traversal)).
The first semester's grade is based 40% on two exams, 25% on the final, 10% on presentations made, 15% on the quality of the reports delivered, and 10% on peer evaluations (based on multiple peer assessment surveys usually conducted on the days reports were due).
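The first-semester weighting is a straightforward weighted sum; the sketch below shows the computation with hypothetical component scores:

```python
# Weighted first-semester grade from the five components listed above.
# The component scores (0-100) below are hypothetical.

weights = {"exams": 0.40, "final": 0.25, "presentations": 0.10,
           "reports": 0.15, "peer_evaluations": 0.10}

def semester_grade(scores):
    # The weights must account for the whole grade.
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * scores[c] for c in weights)

grade = semester_grade({"exams": 80, "final": 75, "presentations": 90,
                        "reports": 85, "peer_evaluations": 88})
print(round(grade, 1))  # -> 81.3
```

The peer-evaluation component is itself an average over the multiple surveys mentioned above before it enters this sum.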
At the start of the second semester the teams receive my critique of their design documents and a list of milestones they must meet in the second semester. They are also informed that the second semester's grade will be based 30% on the quality of the software product delivered at the end of the semester, 20% on peer evaluations, 20% on the final exam, and 30% on the reports submitted. As in the past, the roles previously described are distributed and accepted. Among the first of the new milestones: within one week the teams must correct the deficiencies of the design document, and within an additional week revise the detailed work assignments and enumerate a set of coding standards and other QA practices which the team will follow. Eight calendar weeks into the semester the product test plan, the user documentation, and a partial maintenance manual are to be submitted. During this interval I conduct a code walkthrough with each team and give them procedures for obtaining approval for specification changes. The emphasis, however, is that they are to build to specification; the early delivery of documentation based on the specification is intended to counteract the desire for creeping improvements. Most of the first month is spent in revising and testing the initial design. All of the modules previously labeled COMPLEX or UNCLEAR must be clarified. Students begin to explore their coding assignments and usually uncover ambiguities and inconsistencies in their specifications. Ten weeks into the course the alpha product is demonstrated, followed in a week by the preliminary test reports, which measure how good the alpha versions are, and revised schedules which focus on the best use of the remaining time. Three weeks after the alpha demo is a beta demo (last year the beta demo was scheduled as the program of the local AITP professional chapter).
Two weeks following the beta demo, on the last day of class, the teams do a final project presentation, accompanied by the submission of the final test report, the final user documentation, and the project legacy document.
The alpha and beta demos focus on showing the software product's functionality and on the extent and effectiveness of the user documentation. The final presentation therefore focuses on an assessment of the project experience. For the introduction, the requirements from the first semester are reviewed (WHAT was to be delivered at the end of the project). Then any design problems that were encountered in meeting these specifications are enumerated, including any "adjustments" that were made to the specifications in the process of implementation. Next the team demonstrates the "user friendly" characteristics that make the product efficient to use. This part highlights specific aspects of the GUI, the on-line help, and the written documentation that users will notice and appreciate. The next portion of the presentation is a report on the quality assessment of the product, describing the testing it has undergone and identifying its current limitations and what needs to be done in the future to improve performance and functionality. As a summary of the legacy document, there follows a tabulation of what it has cost to produce this product in terms of team effort, equipment, and software tools. This begins with a description of the physical characteristics of the product (how much code, how many pages of documentation, etc.) and continues with a display of overall measures of productivity. The project's scheduling difficulties are reviewed and lessons about estimating are derived. The presentation is completed with a "what happens now" part that projects the future of the product (what circumstances are required for there to be a future) and describes the maintenance manual, so that the audience has a complete picture of what is being delivered.
During the second semester the document contents and form are not specified, but their organization and content are critiqued immediately after submission and they can be improved and re-submitted. The maintenance manual is described in general terms as an augmentation of the design document which contains all information of use to the maintenance programmer in either repairing or enhancing the software product. The preliminary maintenance manual comprises all the revisions and updates of the original design document and a table of contents that identifies sections to be added after the product is tested.
CONCLUSION
The structure of the one-year project course described above has evolved over a two-decade period in response to pressures enumerated here and elsewhere and it expresses a personal balance between product and process. The structure says little of the tools that may be employed in the course or the choice of languages or design strategies. In the most recent cycle students used Visual Basic and Access with a little use of EasyCase and Visual SourceSafe. The choice of six person teams is arbitrary, but is motivated by the fundamental roles and the amount of effort required to accomplish practical products. Last year one team lost two members but was still able to deliver a (reduced) product. Larger teams provide a more realistic introduction to the communication problems and the documentation needs of software projects and motivate an understanding of standards. My preference for implementing real projects for external users is rooted in the superior motivation that such projects provide and the urban circumstances which provide a multitude of opportunities. This does, however, tend to bias projects toward end-user applications.
It should be no surprise that the schedule and the implementation tools cause the students the most difficulty. The documents can be specified and illustrated (using student work from past years) clearly enough, but students seldom have enough mastery of their implementation language to avoid design and scheduling errors. A single-course introduction to Visual Basic or Access has not provided sufficient breadth or depth of experience to preclude serious misconceptions about the environment's capabilities (in previous years, when the project was implemented in the language of the major's programming core, the students had a sufficient foundation for estimation). Inexperience in working in teams also exacts a cost in the second semester despite the training of the first semester. The prototype and documentation activities do not require a great deal of organization or technical skill; the implementation of all of the modules, often inadequately understood, on a clockwork schedule taxes the abilities of all six team members.
The document-based course lends itself to the utilization of a website [7] for publishing team products. I am in the midst of exploring the use of the Internet to provide thin-client access to server-based tools that can be shared by all team members regardless of their individual computer platforms [8]. I will next pursue video conferencing technology as a means of further assisting collaboration within the teams [9]. These innovations will help teams function more smoothly, but will probably not replace the long hours spent hashing out details in front of a workstation in some team member's den. Whatever the contribution, the flux of technologies will not disturb a course designed to display and immerse students in engineering decision-making.
References
1. Ballew, David, "A Senior Design Course for Computer Science," SIGCSE Bulletin (18,1) February 1986, pp.131-133.
2. Wortman D.B., "Software Projects in an Academic Environment", IEEE Transactions on Software Engineering, (13, 11), November 1987, pp.1176-1181.
3. Mitchell, William, and John Sigle, "The Software Engineering Course: What Should be its Objective?" Journal of Computing in Small Colleges, (5,3) January 1990, pp. 66-72.
4. Mitchell, William, "How Realistic Should a Software Project Course Be?", Journal of Computing in Small Colleges, (6,5) May 1991, pp. 58-61.
5. Rettig, Marc, "Practical Programmer: Software Teams," Communications of the ACM, (33,10) October 1990, pp. 23-27.
6. Mitchell, William, "Software Engineering Tools, A Mixed Blessing for Student Projects," Journal of Computing in Small Colleges, (13,2) November 1997, pp. 164-170.
7. http://155.58.118.44
8. Mitchell, William, NSF Award Abstract #9751282, http://www.nsf.gov/cgi-bin/showaward?award=9751282.
9. Mitchell, William, NSF Award Abstract #9851282, http://www.nsf.gov/cgi-bin/showaward?award=9851282.