13th International Conference on Software Quality
Software Division of
Concurrent Sessions - Wednesday, October 8, 2003:
9:45 a.m. - 10:45 a.m.
Encouraging organizations to follow a defined workflow/process, and assessing their level of compliance with it, can be much easier with automated support for process and documentation workflow. This presentation will describe a newly developed, in-house system based on an enterprise workflow development tool. The organization developing this system employs a mainframe-based production environment with Windows® desktop machines over an NT network, and has been using Lotus Notes®-based methodology/workflow systems for many years.
All too often teams rush from gathering requirements into design, failing to fully analyze the customer’s needs. How can we move from a collection of use cases to a robust system design? Can we reliably predict the behavior and responsibilities of components shared between different subsystems? Or, do we discover “you can’t get there from here”?
The Unified Modeling Language (UML) provides a rich vocabulary and notation with which we can explore requirements. Robustness analysis (also known as use-case analysis) provides an important, but often overlooked, step in the development process. This analysis technique gives us the opportunity to refine our understanding, and to resolve any questions or issues that may arise – before construction.
This talk demonstrates some advantages and principles of using the UML during robustness analysis. Examples will focus on use-case realization, firming up our understanding of a system’s architecture, and tying our notion of business objects to artifacts we will refine during the design step.
James Bielak is President of Greenstone Consulting and mentors and assists Fortune 100 clients with business modeling, requirements acquisition and management, system architecture, and improving communication among project stakeholders. James has extensive industrial experience applying object-oriented technology as a project manager, system architect, technology mentor, and research scientist.
The Software Project Management (SWPM) certificate program at The University of Texas at Austin’s Software Quality Institute (SQI) has been in place for 10 years. According to graduates, the course has been very effective in delivering information, yet SQI continually evaluates the program to determine if it is achieving all of its goals.
This presentation describes the certification program, explains three surveys of graduates, examines program strengths and weaknesses, and considers reports that graduates have had somewhat less effect on their organizations than either they or the program sponsors would desire.
Taught by software practitioners, SWPM aims to provide skills and techniques for students to improve their company’s software processes. SWPM goals are to help students achieve personal career objectives, increase their ability to manage/lead software projects, increase their insight into job responsibilities, and have a better awareness of project management professionalism.
Linda Shafer is the Director of the Software Quality Institute at The University of Texas at Austin, and an instructor and course mentor. She has held positions of programmer, designer, analyst, facilitator, SQA, project leader, and manager in both industry and academia. She has a degree in Math from The University of Texas at Austin and an MBA from The University of New Mexico. She has written or co-authored over two dozen articles for publications in the software industry and four books. Ms. Shafer is a certified Project Management Professional and a Certified Software Quality Engineer.
Herb Krasner is a software excellence consultant. His mission, spanning several decades, has been to enable the development of superior software and to stamp out poor-quality software wherever found. As a coach, mentor, and troubleshooter, he has helped many organizations become more effective producers of superior software products and information systems. He is best known for his leading-edge work on modeling the costs of software quality, reporting ROI data for software process improvement, coaching organizational improvement programs, and the reported results of his empirical studies of professional programmers. He has published over 55 papers, articles, and book sections, and has spoken at many professional conferences and meetings. Over his 30-year career in computing, he has held positions as President, Division Manager, Chief Technical Officer, Project Leader, Chief Software Engineer, Senior Technologist, Assistant Professor of Computer Science, and Management Consultant. He is also a senior faculty member at The University of Texas at Austin, where he teaches classes in Java programming, software engineering, and emerging development best practices. He is the founder of, and an active participant in, the UT Software Quality Institute.
Don Shafer is a co-founder, corporate director, and Chief Technology Officer of Athens Group, Inc. Incorporated in June 1998, Athens Group is an employee-owned consulting firm integrating technology strategy and software solutions. Prior to Athens Group, Shafer led groups developing and marketing hardware and software products for Motorola, AMD, and Crystal Semiconductor. He was responsible for managing a $129 million-a-year PC product group that produced the award-winning audio components for the Apple iMac. From the development of low-level software drivers in yet-to-be-released Microsoft operating systems to the selection and monitoring of Taiwan semiconductor fabrication facilities, Shafer has led key product and process efforts. In the past two years he has led Athens engineers in developing industry-standard semiconductor fab equipment software interfaces, definition of 300mm equipment integration tools, advanced process control state machine data collectors, and embedded system control software agents. Beginning in 1971, he managed analysis teams in Germany and Southeast Asia. In the 1970s and 1980s he managed joint technology development teams in the US and Europe. In the 1990s, while at Motorola, he was the software manager for a product whose first customer was in Great Britain, whose semiconductors were developed in Israel, and whose deliverable software was developed in Bangalore with prototyping in Austin. He earned a BS degree from the USAF Academy and an MBA from the University of Denver. In addition, Shafer’s work experience includes positions at Boeing and Los Alamos National Laboratory. He is currently a graduate lecturer in software engineering at Southwest Texas and the instructor-in-charge of the University of Texas Software Project Management Certification Program. With two other colleagues he wrote a software practitioners’ project management guide for Prentice-Hall.
Why do projects “fail” in the test phase? Do teams have inadequate testers? Are they not doing adequate testing? Is the entire test organization inadequate?
We’ve heard blaming comments about testing and product quality since organizations decided they needed testing groups. Instead of blaming the testers for being inadequate, let’s talk about the testing. Why are so many organizations dissatisfied with their testing activities and perceive those activities to be inadequate? Why are we building organizations whose staff can’t perform the required work?
In this presentation, Rothman will look at how organizations tend to hire testers and other technical people, and alternatives to those techniques. This talk is for you if you’d like to refine your hiring requirements for great testers (or other great technical staff), or if you’d like to become a great tester (or be hired as one).
In general, agility is the characteristic of being ready and able to move with quick and easy grace. In software development, the concept has come to mean the ability to develop valuable software quickly in the face of rapidly evolving requirements. Due to marketplace pressures, agility is an emerging concern for many of today’s software organizations. As a software professional in the next decade, you will need to become agile. Agile methods are relatively new on the software scene and are based on a set of philosophies and assumptions different from the conventional life-cycle models of the past.
Some process pundits have claimed that agility and quality are mutually exclusive concepts, and that agility simply promotes hacking. The agilists on the other hand claim that overly rigid process management detracts from the goal of delivering valuable functionality in a timely manner. This talk will reveal the underlying and hidden issues of this discussion, and will define a situational fit model for assisting in best-practice adoption decisions, leading to the delivery of superior software. Clearly, understanding the true value of the functionality for delivery in a product/system is an area where we can best find the leverage.
In today’s business environment, as technology permeates almost every facet of life, the likelihood of software engineering professionals working in non-software-oriented organizations is very high. This also almost guarantees that people with little or no knowledge of software engineering processes will be key stakeholders in the software engineering process. Designing and documenting an effective software process in this environment can be quite a challenge. If a process is not carefully designed and clearly documented, it is destined to fail.
This talk provides tips and techniques the presenter has used in the past to successfully design and document effective software engineering processes.
Angelica Arceneaus-Dobson has over 15 years of experience in the software industry. Angelica has designed and re-engineered processes in the areas of Design Verification, Software Testing, Requirements Management, Defect Management, Software Development, and Release Engineering/Management. Angelica has designed processes to meet various corporate and industry standards, including IEEE, ISO, and DO-178B.
Angelica started her career at IBM and then went on to work for WRQ Inc. and Applied Microsystems Corp., and is currently employed with i2 Technologies, where she is a Sr. Program Manager with overall responsibility for helping to define the corporate Total Quality Management System.
Project management always involves effectively balancing the scope of effort with the resources available and within an acceptable or predetermined time frame. This presentation discusses the real relationships and tradeoffs between the three key variables of work to be done (scope of effort), resources devoted to that work (both personnel and dollars), and time as defined by the project schedule or ship date. How can we manage these relationships effectively to consistently deliver projects that meet customer objectives on schedule and within budget, with an acceptable level of quality?
James A. Ward is an independent management consultant specializing in system development project management and implementation of quality improvement initiatives in information systems. He holds a Bachelor’s Degree in Economics from the University of Minnesota and an MBA in Finance from the University of Chicago. Mr. Ward is a member of ASQ, SPIN, and PMI, and holds a PMP certification from PMI. Previously he has held positions as Systems Manager with a $1 billion aerospace manufacturing firm, the largest HMO in Florida, and a large teaching hospital.
Checklists are important tools for the inspection process, and a comprehensive definition of their items can directly affect inspection efficiency. On the other hand, since different software work products have different characteristics and features, using the same checklists for all of them isn't effective, even when those checklists are very useful for specific products.
This talk defines a semiformal method for software inspection. The method derives checklist items from the software specification by highlighting the critical features and areas of the software. Concentrating the inspection on critical areas results in optimal use of time and effort.
Another important component of the inspection process is the recording of fault statistics. This talk proposes formulas to calculate the severity of a defect, so defects can be recorded by their severity. By utilizing information gathered during the inspection process, the method also defines a quality criterion useful in determining whether the inspection was successful.
Mehran Sharafi has been a member of the academic board of the computer engineering department at Azad University of Najafabad, Isfahan, Iran, for three years, teaching undergraduate courses in software engineering, data structures and algorithms, discrete structures, and advanced programming.
She holds B.S. and M.S. degrees in Computer Engineering (Software) from Azad University of Najafabad, and has been a Ph.D. student in Computer Engineering (Software) at Azad University of Tehran (Science and Research Branch) since February 2001. For the past two years she has also been project manager for the design and implementation of information systems integration at AZAR Refractories Corporation, Isfahan, Iran. Her areas of research are software verification and validation, testing, inspection, and software architecture.
3:00 p.m. - 4:00 p.m.
Presentations on the Certified Software Quality Engineer (CSQE) certification program and body of knowledge covering the value of the CSQE, the topics in the body of knowledge, the certification program, and the upcoming Software Quality Engineering Handbook.
Doug has more than 25 years of experience, principally in software quality. He works for Accenture in the Government Practice and was the lead for CMMI rollout to the State & Local government practice. He successfully helped the United States Government practice obtain CMMI Level 3 credentials in 9
Many organizations try to implement change. This includes everything from the introduction of a new tool or method, to a companywide process improvement program. Often these attempts fail because too much is attempted too quickly with little thought to the most effective sequence of events. When new ideas are introduced they are either abandoned after a short time or adopted by only a few people.
Whenever a Software Engineering Process Group (SEPG) or process improvement team wants to deploy a change, there are some key principles it should consider to be successful. This talk covers the use of an adoption curve that categorizes an SEPG’s target audience into five groups. Understanding these groups helps the improvement team increase the speed of deployment by determining whom to work with and in which order, reduce the risk of failure by building and deploying the solution in increments, and determine when a policy should be developed and an edict issued.
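The abstract does not name the five-group model the talk uses; the classic example is Rogers' diffusion-of-innovations curve. As a rough sketch (using Rogers' standard population shares, which are an assumption here, not the speaker's figures), an improvement team might size each group when planning a staged rollout:

```python
# Rogers' five adopter categories with his conventional population shares.
# These groups and percentages are an illustrative assumption; the talk's
# own adoption curve may differ.
ADOPTER_GROUPS = [
    ("innovators",     0.025),
    ("early adopters", 0.135),
    ("early majority", 0.340),
    ("late majority",  0.340),
    ("laggards",       0.160),
]

def rollout_plan(staff_count):
    """Estimate how many people fall into each group, in deployment order."""
    return [(name, round(share * staff_count)) for name, share in ADOPTER_GROUPS]

# For a 200-person organization, work with the ~5 innovators first,
# then the ~27 early adopters, and so on.
for group, n in rollout_plan(200):
    print(f"{group}: ~{n} people")
```

The point of such a breakdown is sequencing: winning over the small innovator and early-adopter groups first builds the credibility needed before any policy or edict is issued to the majority.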
As software professionals, we’re aware of the negative publicity surrounding software development and the resulting products. Reports abound describing cost overruns, ineffective and incorrect products, and failed projects.
Consider best practices learned in school or through work experiences. Are they still being applied in your daily work? Have they been abandoned because of limited cycle times and intense schedule pressures? Has the weak economy forced you to stretch your resources by doing more with less? How are you addressing increasing concerns about software safety and security, liability, and reliability? This talk focuses on the basics and fundamentals, and how one might use them for improvement on a personal basis: focus on reducing defects, or on being more productive in areas that will deliver value to you and your organization.
Trudy Howles worked many years as a software consultant, manager, and senior development engineer. She is currently an Assistant Professor in the Department of Computer Science at the Rochester Institute of Technology. Trudy is a member of the ASQ and is a CSQE.
Process, performance, and productivity are three of the top buzzwords today – all of which can be measured and monitored through the effective use of measurement and analysis. So why aren’t more companies readily embracing measurement in their IT and quality departments? While measurement is not rocket science, neither is it a trivial or inexpensive endeavor. Join seasoned metrics professional Carol Dekkers as she explores how to tame your process through the effective and realistic use of measurement.
Carol A. Dekkers is President of Quality Plus Technologies, Inc.
4:15 p.m. - 5:15 p.m.
This presentation will describe a Process Measurement Framework℠ that rapidly achieves measurable results. The Process Measurement Framework℠ is based upon the popular Goal/Question/Metric (G/Q/M) paradigm, the Juran Quality Trilogy, and the initial core measures recommended by the Software Engineering Institute (SEI). The G/Q/M paradigm is applied to the goals of planning, control, and improvement, based on powerful metrics that have a proven track record. Real examples from industry are used to illustrate the power of the Process Measurement Framework℠, which also helps to ensure that all metrics are collected on a form, in a document, or in a database.
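As a rough illustration of the G/Q/M paradigm the framework builds on (a generic sketch, not the presenter's Process Measurement Framework℠; the goal, questions, and metric names below are invented for illustration):

```python
# Goal/Question/Metric: a goal is refined into questions, and each question
# is answered by one or more concrete metrics. The content here is a
# hypothetical example for a "control" goal.
gqm = {
    "goal": "Control delivered product quality",
    "questions": [
        {
            "question": "How many defects escape to the customer?",
            "metrics": ["post-release defects per KLOC"],
        },
        {
            "question": "Where are defects introduced?",
            "metrics": ["defects found per inspection, by phase"],
        },
    ],
}

def metrics_for(goal_tree):
    """Flatten a G/Q/M tree into the list of metrics to collect."""
    return [m for q in goal_tree["questions"] for m in q["metrics"]]

print(metrics_for(gqm))
```

Deriving the metric list from explicit goals and questions, rather than collecting whatever is easy to measure, is the core discipline of G/Q/M: every collected number traces back to a question someone actually needs answered.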
Timothy G. Olson is Founder and President of Quality Improvement Consultants, Inc (QIC). While performing quality consulting, Mr. Olson has helped organizations measurably improve quality and productivity, save millions of dollars in costs of poor quality, and has helped numerous organizations reach higher CMM maturity levels. Mr. Olson has been formally trained in Crosby, Deming, Juran, CMM, and CMMI quality approaches. Mr. Olson is also a Juran Institute Associate. Mr. Olson was a lead-author of a Software Quality Course for the University of Minnesota, and he is currently a member of ASQ and IEEE.
Developers of the Capability Maturity Model (CMM®) believed that the capture and dissemination of best practices throughout the software industry would improve an organization’s bottom line through quality and process improvement. However, correlation of model-based practices with measurable benefits has been largely theoretical and conceptual. In response to the growing need to demonstrate the quantitative value of improvement effort and dollars, Cagley has developed a process for joint model-based and productivity assessment. This assessment process has been deployed, confirming strong correlations between productivity assessment results and model-based strengths and weaknesses. Cagley will discuss how recommendations for matching the organization’s business goals with specific process changes were generated. Each recommendation targeted improvement in productivity and reductions in time to market, delivered defects, total defects, and maintenance effort. The resulting recommendations provided the organization with a road map to better leverage effort and cost for true benefit in the next improvement cycle and beyond.
Thomas Cagley is a Managing Senior Consultant for The David Consulting Group. He is an authority in guiding organizations through the process of integrating software measurement with model-based assessments to yield effective and efficient process improvement programs. Mr. Cagley is a recognized industry expert in the measurement and estimation of software projects. His areas of expertise encompass management experience in methods and metrics, quality integration, quality assurance and the application of the Software Engineering Institute’s Capability Maturity Model® to achieve process improvements.
Mr. Cagley has managed many types of projects within the information technology field including large scale projects software development, conversions and organizational transformation projects. Based on his expertise Mr. Cagley managed the development of an internal project management certification program for software project managers for a major bank holding company. He has also managed and performed quality assurance (technical and process) for large information technology organizations.
Mr. Cagley is a frequent speaker at metrics, quality, and project management conferences.
The software industry is painfully discovering that the tactics and techniques that were successful 10 years ago for established companies, or even those used last year by new entrepreneurial start-ups, are now failing to deliver expected results. The entrepreneurial company Streun worked for was also encountering difficulties because the “code, ship, and the customer will test” technique just didn’t hit the mark any longer. This type of process did not have the required focus to provide appropriate levels of effort for all aspects of the delivery triangle: time, cost, and quality. Our customers were demanding higher-quality products, better service, and lower-cost products, while also expecting more and more advanced functionality in every new release.
Because we could not get full and visible commitment from the highest level of the company’s management, the focus we developed was a “Practical Approach©” to improving the development process. The CMM® was tailored, and then the project manager initiated and implemented process improvement changes within his or her sphere of influence. The ongoing success of that team showed up on the bottom line and built the needed support to drive further improvements.
Geree Streun received her Master’s degree in Computer Science from Southern Methodist University and Bachelor’s, also in Computer Science, from Kansas State University.
She has an extensive software development background including both real-time and interactive software development. She has specialized in Software Process Improvement for the past nine years and has a wide range of experience in process improvement activities. She has led or participated in numerous software Capability Assessments and Evaluations across many different companies and has planned and implemented various process improvement models which support the continuous improvement efforts required to produce high quality products. Ms. Streun was the Leader of the International Software Engineering Process Group (SEPG) for Fisher-Rosemount Systems, the largest supplier of Engineering Process Systems in the world.
She is a member of ASQ and holds the CSQE certification. She was an instructor and past mentor for the University of Texas Software Quality Institute’s Software Project Management Certificate Program. She is a Senior Member of IEEE, a member of ACM and the Function Point User’s Group, an affiliate of the SEI, and a past President of the Austin Software Process Improvement Network (SPIN). She is a PMI member and holds the PMP certification. She is currently the lead of an international team rewriting a key portion of PMI’s newest version of the PMBOK.
What is clinical data quality and why do you care? In an effort to capture the expertise of software quality professionals to help regulated industry handle data quality issues more efficiently, this interactive session will discuss several topics:
1. The state of regulated industry with regard to data quality
2. The trail of data through the life cycle
3. What data quality is and why it is important
4. Processes to assure data quality from a software engineering perspective
Sue Carroll is a principal software quality analyst at SAS, having worked in Quality Assurance and Process & Research. She is a Certified Software Quality Engineer (CSQE) through the American Society for Quality (ASQ) as well as an ASQ Fellow. She received a post-baccalaureate degree in computer science from NCSU. Previously, Sue was a registered nurse, earning her bachelor’s degree in nursing from NCCU. She is the editor-elect of the Software Quality Professional journal, editor for the working group revising IEEE Standard 829 on software test documentation, and the Senior Software Quality Advisor to the Board of Directors of the Data Quality Research Institute.
Kaye H. Fendt is a consultant. She served for four years as a Regulatory Health Information Specialist in CDER, Food and Drug Administration, and was a Fellow and a Co-Coach in the Council for Excellence in Government. Ms. Fendt was a founding director of CDISC and served on its Board of Directors for four years. She served on the Board of Trustees of the Society for Clinical Data Management, where she initiated the Good Clinical Data Management Practices document. She is currently the liaison between the GCDMP committee and the FDA and is developing GCDMP training for FDA inspectors.