Design of Interventions for Instructional Reform in Software Development Education for Competency Enhancement
also be used as a classification of competencies. Kennedy observed that different perspectives
were dominant in different professions and that engineering education shifted its emphasis from
the first to the second perspective after the 1950s. Passow has called for an appropriate balance
of all four perspectives in designing engineering educational programs.
Categories of competencies expected of college graduates (Stark et al.)
Stark et al. advocated blending professional and liberal education, and classified the
competencies expected of college graduates into three broad categories: traditional
professional competencies, liberal professional competencies, and attitudes. As per their
classification, traditional professional competencies comprised conceptual, technical, and
integrative competencies as well as career marketability. The second category of liberal
professional competencies included interpersonal (communication), contextual, and adaptive
competencies as well as critical thinking and leadership capacity. The third category of attitudes
integrated professional identity, professional ethics, scholarly concern for improvement,
motivation for continued learning, and aesthetic sensibility.
Marzano’s Revised Taxonomy
In 2000, Marzano proposed his modifications as a two-dimensional taxonomy: (i) a
knowledge domain comprising information, mental procedures, and psychomotor procedures,
and (ii) processing in the cognitive, meta-cognitive, and self systems, providing the following
hierarchical levels of processing:
1. Cognitive system: processes all the necessary information through the levels of
   a. Retrieval
   b. Comprehension
   c. Analysis
   d. Knowledge utilization
2. Meta-cognitive system: sets goals and keeps track of how well they are being achieved
3. Self-system: decides whether to continue the current behavior or engage in the new activity
Various Competency Classification Schemes Cited by García-Aracil and Van der Velden
García-Aracil and Van der Velden have studied the competencies required of graduates
with reference to the new situation in the European labor market. They have
cited the following earlier competency classification schemes proposed over the preceding twenty years:
1. Becker: general and firm-specific;
2. Nordhaug: firm-specific, task-specific, and industry-specific;
3. Heijke: competencies acquired at school that are of direct use in later work, competencies
   acquired at school that facilitate the acquisition of new competencies after school, and
   competencies acquired mainly in the work context;
4. Bunk: specialized, methodological, participative, and socio-individual; and
5. Kellerman: general academic, scientific operative, personal professional, socio-reflexive.
Coate’s schema for curriculum design
Kelly Coate developed a schema for curriculum design. It included three overlapping
domains of ‘knowing,’ ‘acting,’ and ‘being.’ She suggested that the crucial aspect of this schema
is the domain of ‘being.’ Though all these models have been used by several education
researchers, they have not yet attracted noticeable attention from computer science education researchers.
Annexure AN4: Metzger’s Observations about Debugging
Metzger observes that design errors may occur because of errors in data-structure, algorithm, or
interface specifications, the last related to the user interface, software interface, or hardware interface.
He has also enumerated some common conception-stage errors in software development. The
data-structure-related errors are: missing/incorrect/unclear/contradictory/out-of-order data
definition, missing/incorrect/out-of-order shared-data access control, capacity limitation,
inappropriate representation resulting in data loss, ignored input or intermediate data-storage
requirement, and slow access to data. Algorithm-related errors include: invalid assumptions
about input/program state, omission of logical possibilities, high time complexity, and missing
condition handlers. Conception errors about interfaces include: invalid assumptions about users,
collateral software, or hardware, and missing/superfluous/incorrect/unclear/out-of-order specification
items with reference to the user interface, software interface, or hardware interface.
Metzger posits that coding errors include initialization errors, finalization errors, binding errors,
reference errors, static/dynamic data structure errors, memory problems, missing operations,
extra operations, control flow problems, value precision errors, invalid expressions, and incorrect
usage of, or defects in, the compiler/tools/system library/third-party library/operating system.
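Two of the coding-error categories above, initialization errors and binding errors, can be illustrated with a short sketch. The example below is our own (not Metzger's), uses Python, and the function and variable names are hypothetical.

```python
# Initialization error: a mutable default argument is created once, at
# function definition time, so state leaks between calls.
def append_bad(item, acc=[]):        # bug: all calls share one default list
    acc.append(item)
    return acc

def append_good(item, acc=None):     # fix: initialize a fresh list per call
    if acc is None:
        acc = []
    acc.append(item)
    return acc

print(append_bad(1), append_bad(2))    # [1, 2] [1, 2] -- state leaked
print(append_good(1), append_good(2))  # [1] [2]

# Binding error: closures created in a loop capture the loop variable
# itself, not its value at the time the closure was made.
bad = [lambda: i for i in range(3)]
print([f() for f in bad])              # [2, 2, 2] -- all see the final i
good = [lambda i=i: i for i in range(3)]  # fix: bind the value as a default
print([f() for f in good])             # [0, 1, 2]
```

Both defects compile and run silently, which is what makes this category of error hard to find by inspection alone.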
Errors because of rule-based reasoning
Metzger catalogues software errors arising from rule-based reasoning into two broad
categories: (i) misapplication of good rules, which occurs when a time-tested rule is applied
while overlooking additional conditions that warrant another rule, and (ii) application of a bad
rule, which occurs when conditions are wrongly represented, or an ineffective/inefficient action is chosen. The
first category includes errors in the sub-categories of ‘general rule, exception condition,’
‘conflicting signals,’ and ‘excess information, multiple rule match.’ Humans make rigidity errors
because they have a strong bias to apply techniques that worked in the past, even though the
current circumstances are no longer the same. In the second category of applying bad rules,
Metzger includes the subcategories of ‘incorrect rule conditions,’ ‘incorrect rule actions,’
‘frequently ineffective rules,’ ‘formerly effective rules,’ and ‘occasionally effective rules.’
Debugging as a search problem
Metzger has viewed debugging as a search problem, akin to mathematical problem solving, that
can be tackled using a variety of search strategies such as binary search, greedy search,
depth-first search, and breadth-first search.
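The binary-search strategy can be sketched as a bisection over an ordered sequence of program versions, which is the idea behind tools such as `git bisect`. The code below is a minimal illustration of that idea, not Metzger's own procedure; `versions` and `test_passes` are hypothetical stand-ins for a real revision history and a real regression test.

```python
def first_failing(versions, test_passes):
    """Binary-search for the earliest version on which the test fails.

    Assumes the test passes on versions[0], fails on versions[-1], and
    that once broken it stays broken (monotone failure).
    """
    lo, hi = 0, len(versions) - 1      # invariant: lo passes, hi fails
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if test_passes(versions[mid]):
            lo = mid                   # fault was introduced after mid
        else:
            hi = mid                   # fault was introduced at or before mid
    return versions[hi]

# Suppose the defect was introduced in version 6 of 10:
versions = list(range(10))
print(first_failing(versions, lambda v: v < 6))  # -> 6
```

Each test run halves the remaining suspect range, so localizing a fault among n versions needs only about log2(n) test executions.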
Heuristics for solving debugging problems
Metzger suggests the following heuristics: (i) stabilize the problem, (ii) create a standalone test
case, (iii) categorize the problem with reference to correctness, completion, robustness, and
efficiency, (iv) describe the problem according to a standard methodology, (v) explain the
problem to someone else, (vi) recall a similar problem, (vii) draw diagrams such as the control
flow graph, the data flow graph, and complex data structures with pointers, and (viii) choose a
hypothesis from historical data.
He has also suggested some strategies like program slice strategy, deductive reasoning strategy,
and inductive reasoning strategy for debugging.
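The program-slice strategy can be sketched over a toy representation of a program: if each statement records the variable it defines and the variables it uses, a backward slice on a faulty output is the set of statements reachable through those def-use links. This is a minimal illustration of the general idea, not Metzger's formulation; the statement table and variable names are invented.

```python
def backward_slice(stmts, target):
    """Return the ids of statements that can affect `target`.

    stmts: list of (stmt_id, defined_var, used_vars) in program order.
    """
    relevant = {target}              # variables whose values must be explained
    slice_ids = []
    for stmt_id, defined, used in reversed(stmts):
        if defined in relevant:
            slice_ids.append(stmt_id)
            relevant.discard(defined)
            relevant.update(used)    # now explain this statement's inputs
    return sorted(slice_ids)

# Toy program:
#   1: a = input()
#   2: b = input()
#   3: c = a + 1
#   4: d = b * 2
#   5: out = c        <- faulty output; slice on "out"
stmts = [(1, "a", set()), (2, "b", set()),
         (3, "c", {"a"}), (4, "d", {"b"}),
         (5, "out", {"c"})]
print(backward_slice(stmts, "out"))  # -> [1, 3, 5]; statements 2 and 4 are irrelevant
```

During debugging, the payoff is that statements outside the slice can be ignored, shrinking the code a developer must inspect.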
Annexure AN5: Lethbridge’s Study on Most Important and Influential Topics
in Software Development Education
Lethbridge [46-48] surveyed approximately 200 practicing software engineers and managers.
The respondents had degrees in computer science, computer engineering, electrical engineering,
information systems, software engineering, and other engineering disciplines. They represented a
broad cross-section of the industry, and developed software for management information
systems, data processing, consumer or mass market software, real-time systems, and other
application software. They were asked to rate educational topics on the basis of four criteria:
(Q1) how much they had learned about it in their formal education, (Q2) how much they know
now about it, (Q3) how important the topic has been in their career, and (Q4) how much
influence the topic had on their overall thinking.
Lethbridge included a total of seventy-five topics from thirteen subject categories in the survey.
The ten topics identified as most important, in terms of both the career-related utility of topic
details and overall influence on thinking, were: specific programming languages, data
structures, software design and patterns, software architecture, requirements gathering and
analysis, HCI/user interfaces, object-oriented concepts and technology, ethics and
professionalism, and analysis and design methods.
In terms of the perceived gap between importance and influence (Q3 and Q4) and what was
learned in formal education (Q1), out of the thirteen subject categories the respondents felt the
most serious deficiencies in three: software engineering process, humanities and skills, and
software design core.
Their report shows that five out of the thirteen subject categories did not contribute even a single
topic to the list of twenty-five most important and influential topics, while these categories were
felt by the respondents to be overemphasized in the curriculum. These subject categories are
theoretical computer science, mathematical topics in computer science, other hardware topics,
general mathematics, and basic science.
Annexure AN6: Some Important Models on Problem Solving
Jonassen’s Taxonomy of Problems
Jonassen has proposed a taxonomy of problems based on variations in problem types and
representations. The problem types vary in a continuous three-dimensional space defined by
structuredness, complexity, and degree of domain specificity. The first of these, structuredness,
varies in a continuum from extremely well-structured to absolutely ill-structured, as discussed
above. The second factor, complexity, depends upon the number of issues, functions, and
variables involved, as well as the interactions among them and the degree of uncertainty in their
behavior. The third factor is the degree of domain specificity.
Based on the cognitive task analysis of various kinds of problems, Jonassen has identified eleven
different kinds of problems – (i) logical problems, (ii) algorithmic problems, (iii) story problems,
(iv) rule using problems, (v) decision making problems, (vi) troubleshooting problems, (vii)
diagnostic-solution problems, (viii) strategic-tactical performance, (ix) situated case problems,
(x) design problems, and (xi) dilemmas. It may be noted that as per his classification, algorithmic
problems deal with direct application of known algorithms.
Polya’s Model on Mathematical Problem Solving
Polya  listed four phases of problem solving: (i) understand the problem, (ii) plan the
solution, (iii) execute the plan, and (iv) review the results. Table AN6.1 gives further details for
each of these phases.
Table AN6.1: Polya’s recommended cognitive engagement of mathematical problem solving
1. Understand the mathematical problem: (i) what is the unknown, (ii) what are the data, (iii) what is the
condition, (iv) is the condition sufficient/insufficient/redundant/contradictory for determining the unknown,
(v) draw a figure and introduce suitable notation, and (vi) separate the various parts of the condition.
2. Plan for finding the solution: (i) is the problem familiar, (ii) identify related problems, (iii) identify related
theorem(s), (iv) identify a similar problem that has been solved before for a similar unknown, (v) is the
solution plan of such a problem reusable in terms of results and/or method (with some auxiliary elements, if
needed), (vi) restate the problem in different manners, (vii) go back to the definitions, (viii) if the problem
can’t be solved, solve some related problem that may be more general, more specific, more special, or
analogous, or solve some part of the problem, (ix) how far can the unknown be determined by dropping or
varying part of the condition, and can something useful be derived from this data, (x) think of other data
appropriate to determine the unknown, (xi) can the data and the unknown be changed and/or brought nearer
to each other, and (xii) have all the data, the condition, and the essential notions been considered?
3. Plan execution: Polya recommended engagements like: (i) carry out the plan and check each step, (ii) can
you see clearly that the step is correct, and (iii) can you prove that it is correct?
4. Review stage: (i) check the result, (ii) check the argument, (iii) can you derive the result differently? (iv) can
you see it at a glance? and (v) can the result or the method be used for solving some other problem?
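As a small worked illustration of the four phases (our own example, not Polya's), consider finding two numbers whose sum is 10 and product is 21:

```latex
% Understand: unknowns $x, y$; data: sum $s = 10$, product $p = 21$;
% condition: $x + y = 10$ and $xy = 21$.
% Plan: relate to a familiar solved problem -- the roots of the quadratic
% $t^2 - st + p = 0$ are exactly the numbers with sum $s$ and product $p$.
% Execute, checking each step:
\begin{align*}
t^2 - 10t + 21 &= 0\\
(t - 3)(t - 7) &= 0\\
t = 3 &\ \text{or}\ t = 7
\end{align*}
% Review: check that $3 + 7 = 10$ and $3 \cdot 7 = 21$; note that the same
% method can be reused for any given sum $s$ and product $p$.
```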
Galotti’s Collation of Some Techniques for Solving Puzzle-like Problems
Galotti collates some general, domain-independent techniques for puzzle-like problems.
1. Generate and Test involves generating possible solutions and then testing them. It is useful
when there are only a few possibilities to track, and it loses its effectiveness rapidly when there
are many possibilities or no particular guidance for the generation process.
2. Means-Ends Analysis consists of comparing the goal with the starting point, thinking of
possible ways of overcoming the gap, and choosing the best one. If required, sub-goals are
created to break down the task into manageable steps. It does not necessarily ensure the best
solution.
3. Working backward also reduces the gap between the current state and the goal state, by
determining the last step needed to achieve the goal, then the next-to-last step, and so on. It is
very effective when the backward path is unique.
4. Backtracking involves making provisional choices and unmaking them if they turn out to be
wrong, so that one can back up to an earlier choice point and start over by making a different
choice.
5. Reasoning by analogy works when the problem solver is able to form an abstract schema of
the presented stories and apply it to a new analogous problem. Research has shown that few
people are able to form such a schema and see the analogy unless told to do so.
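Backtracking in particular maps directly onto code: the classic illustration is placing n non-attacking queens on an n-by-n board, where each provisional choice of a column is unmade when it leads to a dead end. The sketch below is our own illustration of the general technique, not Galotti's.

```python
def solve_queens(n, cols=()):
    """Return one solution as a tuple of column indices (one per row),
    or None if no placement exists."""
    row = len(cols)
    if row == n:
        return cols                    # every row has a safe queen
    for col in range(n):
        # provisional choice: is `col` safe against all earlier rows?
        if all(col != c and abs(col - c) != row - r
               for r, c in enumerate(cols)):
            result = solve_queens(n, cols + (col,))
            if result is not None:
                return result          # the choice worked out
        # otherwise fall through: unmake the choice, try the next column
    return None                        # dead end; caller backtracks

print(solve_queens(4))  # -> (1, 3, 0, 2)
```

The recursion embodies the "back up to an earlier choice point" step: returning `None` from a call discards that provisional row and resumes the loop one level up.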
Nickols’ typology of problem solving approaches
Nickols proposed a typology of problem-solving approaches. A repair approach is required
to put things back the way they were, an improvement approach is required to improve upon
existing arrangements, and an engineering approach is suitable for creating new, far superior
arrangements. The repair approach starts from symptoms and focuses on causes and corrective
measures through fault isolation. The improvement approach starts from existing
systems/arrangements and focuses on constraints and modifications through structural analysis.
The engineering approach starts from the required results and focuses on the required design.
16 Habits of Mind (Costa and Kallick)
Costa and Kallick have identified the following sixteen characteristics of what intelligent
people do when they are confronted with problems whose resolution is not immediately
apparent: (i) persisting, (ii) managing impulsivity, (iii) listening to others with understanding
and empathy, (iv) thinking flexibly, (v) thinking about our thinking (meta-cognition), (vi)
striving for accuracy and precision, (vii) questioning and posing problems, (viii) applying past
knowledge to new situations, (ix) thinking and communicating with clarity and precision, (x)
gathering data through all senses, (xi) creating, imagining, and innovating, (xii) responding with
wonderment and awe, (xiii) taking responsible risks, (xiv) finding humor, (xv) thinking
interdependently, and (xvi) learning continuously.
Annexure AN7: Some Theories on Attention
Galotti gives an excellent account of research findings on attention. The term ‘selective
attention’ means that we usually focus our attention on one or a few tasks or events rather than
on many. In 1958, Broadbent proposed his ‘filter theory,’ which specified that we can attend to
only one stimulus at a time. In the 1960s, Anne Treisman proposed her ‘attenuation theory’ as a
modification to the filter theory. She suggested that rather than being fully blocked and
discarded, unattended signals are weakened and some of their information is retained for future
use.
In the 1960s, Deutsch and Deutsch, and also Norman, proposed their ‘late selection theory,’
taking the position that all messages are routinely processed for at least some aspects of meaning,
with the selection of a message for response happening later. At a low level of alertness, only
very important messages capture attention, whereas at a higher level of alertness, less important
messages can also be processed. In 1978, Johnston and Heinz proposed a broader model in the
form of ‘multimode theory,’ which viewed attention as a flexible system that allows selection of
one message over others at several different points; later selection requires more processing
capacity and effort.
In 1973, Kahneman presented his model of attention, in which the availability of mental
resources is affected by the overall level of arousal, or state of alertness. In the 1980s, Anne
Treisman showed that perceiving individual features takes little effort or attention, whereas
gluing features together into a coherent object requires more. As per the ‘capacity theory of
comprehension’ proposed in 1991, differences in individuals’ working memory capacity can
account for qualitative and quantitative differences in comprehension. In 2001, Conway et al
showed that a lower working memory capacity results in a lesser ability to focus. Research has
shown that practice plays an enormous role in performance on simultaneous dual tasks, but there
are serious limitations on the number of things we can do simultaneously, and complex
individual tasks make simultaneous performance even more difficult.
Annexure AN8: Some Important Perspectives on Curiosity
Arnone cites Daniel Berlyne, who in the 1960s identified two forms of curiosity:
diversive (e.g., novelty seeking) and specific (e.g., uncertainty, conceptual conflict, information
seeking). Arnone also refers to Loewenstein’s information-gap theory of specific epistemic
curiosity, according to which a feeling of deprivation occurs when an individual becomes aware
of a difference between “what one knows and what one wants to know.”
Peterson et al view curiosity as one of the core cognitive virtues for all humans. According
to their meta-analysis of various philosophical perspectives and research findings, curiosity
includes interest, novelty seeking, and openness to experience. It implies taking an interest in
ongoing experience for its own sake, finding topics and subjects fascinating, as well as
tendencies toward exploring and discovering.
Peterson et al have given an excellent account of research on curiosity. According to the
cognitive process theory, curiosity results from two traits: openness to novel stimuli and a
concern for orderliness. Under this theory, curiosity is a function of assimilating and
accommodating novel stimuli into one’s cognitive map. The personal growth facilitation model
of curiosity suggests a four-step process: (i) allocation of attention and energy for recognizing
and pursuing cues of novelty and challenge, (ii) cognitive evaluation and behavioral exploration
of challenging activities, (iii) deep absorption in these activities, and (iv) integration of the
curiosity experience through assimilation and accommodation. In seemingly boring situations,
highly curious people are more oriented towards finding novelty and are also more sensitive to
cues that can increase interest in meaningful and unavoidable activities. Peterson et al cite
research showing that in college, students with a high curiosity trait asked five times more
questions than students with a low curiosity trait.