
© 2012 Ateneo de Naga University
ISSN 1655-7247
Gibón vol. IX (2011) pp. 3–34
Regular Research Article
Construction and Initial Validation of the
Inventory of Study Skills and Attitudes (ISSA)
for Filipino Students
Margarita Felipe-Fajardo
Department of Literature and Language Studies
Ateneo de Naga University
Abstract
This study describes the development and initial validation of an
instrument, the Inventory of Study Skills and Attitudes (ISSA), designed to assess the study skills of Filipino college students. The
development of the ISSA consisted of three phases. In the first
phase, items were constructed based on cognitive, meta-cognitive,
and motivation theories. The ISSA underwent three revisions before its final version. The initial and second versions of the ISSA
were administered to a small group of freshman college students
in three universities in Naga City. Using item-total correlation
and reliability analyses to assess the psychometric quality, the first
version of 279 items was reduced to 171 in the second version and
84 in the third. Exploratory factor analysis of the third version
administered to a large sample revealed that the ISSA is a multidimensional instrument consisting of three constructs: Meaningful
Learning, Self-Regulation, and Planning & Organization. The final inventory, consisting of 28 items, has an excellent internal consistency of .90, while its subscales show alpha coefficient
values of .86, .79, and .81, respectively. Implications of the findings
of this study in teaching freshman college students how to develop
appropriate study skills and attitudes for academic success are discussed.
Keywords: learner autonomy, study strategies, study skills, metacognition, motivation, factor analysis, test validation, Filipino college students
The problem of academic underachievement among first year
college students is a prevalent issue in higher-education institutions.
In high school, learning is generally teacher-directed in terms of goalsetting, content, and process. This general framework of directed
teaching and learning in secondary schools encourages most students
to coast through school, passively receiving knowledge (Kidwell,
2005). When they enter college, however, they soon realize that
college life is full of challenges such as adjustment to different roles,
and juggling multiple tasks (Petersen, Lavelle, & Guarino, 2006).
Their constant exposure to difficult and unfamiliar tasks leads to
frequent experiences of failure. Moreover,
academic competition is tight, putting more pressure on them to excel. There are also the problems of building new social relationships
and making critical choices in relation to their career (Stupnisky,
Renand, Daniels, Haynes, & Perry, 2008). If college freshman students are not given support at this point, they may belong to what
Goetz and Palmer (1991) label as “academically-at-risk” university
students. They are the ones prone to academic failure as characterized by significantly below-average academic performance (Kayler &
Sherman, 2009), and who are twice as likely to drop out of school
than their achieving counterparts (Tuckman, 2003).
Thus, to help increase retention in college and eventually ensure high graduation rates, many higher education institutions implement intervention programs to help improve students’ academic performance
and enhance their motivation to learn. These intervention programs
are usually in the form of offering a separate study skills course
where at-risk students, voluntarily or by referral, attend a usually
non-credit course to learn skills in note-taking, reading, memorizing,
examination, etc. to help them cope with the academic demands of
their regular courses. In the Philippines, these skills are taught in the
course, Study and Thinking Skills, a credit course suggested by the
Philippine Commission on Higher Education to be taken by college
students in their first year in college.
However, one problem encountered by the instructors handling
this course is deciding on which particular study strategies to teach.
Study strategies are “a group of systematic procedures or activities
applied during learning that support students’ active manipulation of
text content and other materials” (Meneghetti, De Beni, & Cornoldi,
2007, p. 629) and cover a wide range of strategies on note-taking,
organizing, scheduling, concentrating, storing information (Yip &
Chung, 2005), to name a few. One way for the instructors to determine which strategies are the most effective to teach to students is
to use validated study skills questionnaires.
Many studies have been done on college students’ study skills
using a number of questionnaires. These studies wanted to find out
what factors lead to academic success.
Cognition
A meta-analysis of 52 studies on study skills by Purdie and
Hattie (1999) revealed that study skill intervention programs work
most of the time. Students who underwent a strategy training course
earned significantly higher grade point averages in comparison to
those who were not given such training (Robyak, 1978; Tuckman,
2003). Students’ use of more and varied strategies, as measured by
their total score in a study skills instrument, positively correlated
with cognitive and affective outcomes (Young & Ley, 2000). The
sequential pattern followed in using the strategies also distinguishes
the successful learners from the unsuccessful ones (Caverly & Flippo,
2000).
Based on the overall results of these intervention programs, the
best predictor of students’ success is their study strategies (Kitsantas,
Winsler, & Huie, 2008; Tuckman, 2003; Yip & Chung, 2005), with certain study strategies consistently linked to high academic performance. Students who use deep-processing strategies tend to be intrinsically motivated (Phan, 2009), which in turn leads to academic achievement
(Purdie & Hattie, 1999). Also, students who engage in meaningful
and directed practice (Young & Ley, 2000) such as using strategies to
activate their background knowledge, question, predict, and clarify
as they read, are likely to comprehend the material better (Crandall,
Jaramillo, Olsen, & Peyton, 2002; Meneghetti et al., 2007). Information is retained in long-term memory when students are taught
how to organize information in the text by summarizing, making
an outline, or using graphic organizers (Moreno & Martin, 2007;
Tuckman, 2003). Other study strategies relating positively to academic performance are concentration techniques, critical thinking
skills (Stupnisky et al., 2008), efficient searching of information during research, and problem solving (Allgood, Risko, Alvarez, & Fairbanks, 2000). Students’ epistemological beliefs are also connected
with their use of study strategies. Those with sophisticated beliefs
about the nature of knowledge and learning tend to be good strategy
users (Cole, Goetz, & Willson, 2000).
On the other hand, there are some strategies linked with negative outcomes such as rote learning or increasing time on task. Unsuccessful students who view knowledge as something that involves
memorizing facts and formulas tend to use the surface approach to
learning (Allgood et al., 2000; Purdie & Hattie, 1999).
Metacognition
Successful learners not only know how to use cognitive rules
but are also highly introspective of how they learn (Stewart & Landine, 1995). Academic achievers implement a study plan, know how
to use good and useful strategies appropriate to the academic context
and monitor their study behaviors (Cukras, 2006) through the use
of such strategies as self-testing, self-reinforcement, self-instruction
(Young & Ley, 2000) or deep elaboration of the material (Allgood
et al., 2000; Meneghetti et al., 2007). Successful students are also
able to identify their weak areas, seek help when necessary, evaluate
and adjust their performance after support has been given (Fasset,
2002).
Self-efficacy is another factor which may lead to students’
academic success. Students most likely to do well in school are those
who believe that they have the ability to succeed if they are willing
to exert effort to finish a task and take full responsibility for the
outcome of their behaviors in school (Allgood et al., 2000; Young &
Ley, 2000).
Moreover, students’ ability to set goals for themselves is associated with better academic performance. The more students perceive a
task as instrumental to fulfilling their academic
objectives, the more they strive to meet those long-term goals using
deep-processing strategies (Lizzio & Wilson, 2004; Phan, 2009).
Motivation
Motivation plays an integral role in the academic achievement
of a student. Studies show that students who are highly motivated
in school display the following characteristics: They set achievement
goals for themselves (Kitsantas et al., 2008; Tuckman, 2003), value
mastery of the material (Balduf, 2009), see the relevance of the task
to their present course of study (Lizzio & Wilson, 2004), and make
extra effort to accomplish a goal because of the expectation of an
extrinsic reward important to them (Meneghetti et al., 2007). They
are also self-determined (Fasset, 2002), intrinsically absorbed in the
academic task (Allgood et al., 2000) and able to persistently sustain their learning even in the most difficult contexts (Young & Ley,
2000).
On the other hand, students with low internal motivation are
those who, by not having a clear purpose for their behaviors, are usually bored in class, detached from their tasks, and cannot predict the
consequences of their actions (Legault, Green-Demers, & Pelletier,
2006).
Other Contextual Factors
Other factors associated with successful learning are the ability of students to use time wisely (Kitsantas et al., 2008), cope
with stressful situations (Petersen et al., 2006), and situate themselves in
environments that foster learning (Young & Ley, 2000). Even a student’s personality plays a role. Students with positive self-concept,
high self-esteem, and a positive attitude toward learning are more
likely to succeed in school than those who are pessimistic (Allgood
et al., 2000), perfectionistic, unable to take risks (Balduf, 2009), or
unable to adjust to multiple roles when faced with difficult academic situations
(Petersen et al., 2006).
Less frequently mentioned factors in the literature, but nevertheless related to academic success, are the support that students receive
from their families, the ability to socially interact with peers, their
commitment to finish college (Kitsantas et al., 2008), and their general preparation for and adjustment to the greater demands of a
college education (Balduf, 2009; Grimes & David, 1999; Legault et
al., 2006).
Purpose of the Study
While there are many scales exploring the factors affecting students’ study skills, there is a need for an instrument that is context-specific to the experience of Filipino college learners. Moreover,
the instrument should be one that can be administered locally
for diagnostic and practical purposes.
The study skills assessment instruments developed so far were
made in the West, and the majority of these instruments
are used as predictors of students’ academic achievement. To the
best knowledge of the researcher, no study skills and attitudes scale
has yet been constructed, validated, and published to diagnose Filipino college students’ learning strengths and weaknesses and provide
individualized remedial training. When Filipino students are made
to use study skills instruments developed in the West, they may
encounter difficulties with their content and language. Moreover, existing assessment instruments are commercially available, but the cost
of purchasing these tools for administration to freshman
students may be prohibitive for most universities in the Philippines.
The main goal of this study was to construct a short study
skills questionnaire which includes cognitive, meta-cognitive, and
motivational components of learning. This instrument is designed
to diagnose Filipino students’ skills, strategies, and attitudes toward
studying. The development of the Inventory of Study Skills and Attitudes (ISSA) consisted of three phases. The first and second versions
of the ISSA were administered to a small group of freshman college
students in three universities in Naga City. Psychometric quality
was assessed by using item-total correlation and reliability analyses.
After the third version of the ISSA was administered to a large sample, initial validity of the ISSA was established using exploratory
factor analysis.
Methodology
Construction of the First Version of the ISSA
Item Selection
The first step in the construction of the ISSA was to review
a wide range of sources which included books, journals, educational
online sources, questionnaires on study skills and educational motivation to identify the study skills domains. Based on this research,
ten domains seem to be recurrently mentioned in the literature: Memory,
Concentration, Examination, Note-taking, Reading, Writing, Class
Participation, Time Management, Health Management, and Motivation. These were the same domains included in the initial inventory.
The following rules were used to limit the range of scales and
items during the construction stage: (a) each item should represent
one of the ten domains of study skills and attitudes defined in the
study, and should describe a cognitive, meta-cognitive, or motivational technique or strategy; (b) each item should focus only on a behavior or skill that can be taught, altered, and subjected to remediation; (c) the statement should be phrased such that it is answerable
on a continuum from Never to Always (Ashmore, Del Boca, & Bilder,
1995); (d) the item must be oriented to the individual as an “I” or
“me” statement (Piazza & Siebert, 2008); (e) no English idiomatic
expression is to be used in constructing the items (e.g., “I tend to see
the forest for the trees when studying”) so that they are understandable
to students learning English as a second language; and (f) the item
should be either positively-worded (e.g., “I study during the time
when I am most alert”) or negatively-worded (e.g., “I submit assignments or projects after the deadline”). This last guideline was
set to prevent the threat of an acquiescent response style. Cronbach
(1950) used the term “acquiescence” to mean the tendency of respondents to simply agree with positively-worded items because doing so
entails less cognitive effort. The presence of negatively-worded
items, on the other hand, ensures that respondents process
each statement more carefully (Antonak & Larrivee, 1997). Based
on these guidelines, 295 items were initially constructed for the item
pool.
Two field experts were asked to examine this initial item pool
to ensure its face and content validity. Based on their feedback, item
revisions were made. For example, some closely-related ideas were
combined into one item; items which did not discriminate between
good and bad study behavior were reworded; items that did not
directly deal with study practices were deleted; the ‘Health Management’ domain was renamed ‘Stress Management’ so that its items
focus directly on how students cope with their school-related problems; and some items were re-categorized to fit the definition of the
domain. Although some items seemed to fit more
than one category (e.g., “I schedule time for fun” could be a
strategy for both time and stress management), these items were retained
since the results of the factor analysis in the latter part of the study
would determine which category each item fits. The experts also
suggested some items for inclusion. This process led to the creation of
the first version of the ISSA consisting of 279 items. This first version
consisted of items in the areas of Concentration (33), Class Participation (18), Memory (19), Motivation (40), Note-taking (30), Reading
(24), Stress Management (18), Examination (43), Time Management
(33), and Writing (21).
Instruments
The Brief Social Desirability Scale (BSDS).
A self-report of students’ use of study skills may invite some
respondents to present themselves in a positive light and affect the
validity of the participants’ responses. To prevent the invalidation
of response due to social desirability bias, this study incorporated
Haghighat’s (2005) Brief Social Desirability Scale (BSDS) in the two
pilot tests to screen out respondents who exhibited a high tendency
to give socially desirable answers (Merydith, Prout, & Blaha, 2003).
The BSDS asks students to answer four questions with either “yes”
or “no.” In this study, respondents who answered “yes” (the socially
desirable response) to more than two questions out of the four were
excluded from the two pilot tests. The purpose of the BSDS, however,
was deliberately withheld from the respondents to prevent them from
giving the expected answers.
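The exclusion rule described above can be sketched programmatically. The following is a minimal Python illustration; the function name and the literal "yes"/"no" encoding of responses are assumptions for illustration, not part of the BSDS itself:

```python
def passes_bsds_screen(answers):
    """Return True if a respondent's data are retained.

    answers: the four "yes"/"no" responses to the BSDS.
    Respondents who give the socially desirable answer ("yes")
    to more than two of the four questions are screened out.
    """
    yes_count = sum(1 for a in answers if a.lower() == "yes")
    return yes_count <= 2
```

For example, a respondent answering "yes" to three of the four questions would be excluded from the pilot data, while one answering "yes" to two questions would be retained.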
The Inventory of Study Skills and Attitudes (ISSA).
The ISSA is a self-report inventory which asks students to cite
the frequency with which they use or adopt a particular study skill
or attitude. To prevent the threat of the non-informative mid-point
response style, students were made to rate their frequency of use on a
6-point continuum from “Never true of me” to “Always true of me.”
In the encoding of data, a score of 1 indicates “Never” and a score of 6
indicates “Always” for positively-worded statements, whereas reverse
scoring was used for negatively-worded statements. Each survey form
contained representative items from the ten defined domains of study
skills and attitudes. The items were arranged at random and coded
in the instrument to prevent the threat of set-response (Antonak &
Larrivee, 1997).
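The scoring scheme described above can be sketched as follows. The function name is hypothetical, but the rule matches the description: the raw rating is kept for positively-worded items, and negatively-worded items are reverse-scored on the 6-point scale:

```python
def score_item(response, negatively_worded=False, scale_max=6):
    """Score a single ISSA response on a 1..scale_max frequency scale.

    Positively-worded items keep the raw rating; negatively-worded
    items are reverse-scored so that a higher score always reflects
    a more desirable study behavior.
    """
    if not 1 <= response <= scale_max:
        raise ValueError("response out of range")
    return scale_max + 1 - response if negatively_worded else response
```

Thus a rating of 1 ("Never true of me") on a negatively-worded item scores 6, the same as a rating of 6 ("Always true of me") on a positively-worded item.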
Scale Administration of the First Pilot-Test
Data from 267 college freshman students (189 female, 78 male),
with ages ranging from 15 to 23 years (M = 16.96), were used in the first
pilot test from January to February 2009. Of the 267 respondents,
101 were from Ateneo de Naga University, 89 from the University of
Nueva Caceres, and 77 from Universidad de Sta. Isabel.
Respondents were given the ISSA during regular class sessions
by their instructors. To ensure efficiency in the administration of the
instrument and prevent students from experiencing test-fatigue, the
279 items were divided into three inventory forms (1A, 1B, and 1C).
To encourage students to reveal their true study attitudes and
skills, the cover letter emphasized that the confidentiality of
their responses was assured, that their answers would have no bearing
on their grade in the course, and that there were no right or wrong
answers in the survey. They were also advised to answer all items.
This was done to further reduce the chances of including incorrect
data in the analysis (Tam & Coleman, 2009). In this case, data with
missing answers were excluded in the analysis. Respondents were
also asked to comment on their perceived difficulties in understanding
the administration procedures, instructions or specific items in the
survey.
Results of the First Pilot-Test
To establish the reliability of the initial version, the Reliability
module of the Statistical Package for Social Sciences (SPSS, 2005)
was used. Items which tended to decrease the coefficient alpha statistic were identified and deleted from the inventory. The process was
repeated until a measure with a reasonably large coefficient alpha
was created (Item Analysis and Estimating Reliability Tutorial 8,
n.d.). Nunnally (1978) suggested retaining items which have a minimum Cronbach’s alpha of .70. This process reduced the items to 171
in the second version of the ISSA.
Item-analysis results showed a Cronbach’s alpha coefficient of
.89 for Inventory 1A (44 items), .88 for Inventory 1B (42 items), and
.93 for Inventory 1C (48 items). Values of the Cronbach’s coefficient
alpha if item is deleted in the three forms ranged from .88 to .93.
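The reliability procedure described above, estimating coefficient alpha and the alpha that would result if each item were deleted, can be sketched with the standard formula α = k/(k−1) · (1 − Σσ²ᵢ / σ²ₜ). This is a generic numpy illustration of the computation, not the SPSS Reliability module itself; the function names are assumptions:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's coefficient alpha for an (n_respondents, n_items) matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def alpha_if_item_deleted(scores):
    """Alpha recomputed with each item removed in turn; items whose
    removal raises alpha are candidates for deletion."""
    scores = np.asarray(scores, dtype=float)
    return [cronbach_alpha(np.delete(scores, j, axis=1))
            for j in range(scores.shape[1])]
```

Deleting any item whose alpha-if-deleted value exceeds the overall alpha, and repeating, reproduces the iterative pruning described above.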
Construction of the Second Version of the ISSA
Construction of the second version of the ISSA was based on
both statistical results and respondents’ feedback on the first pilot
test.
Results of the reliability analysis revealed that, of the initial 279-item inventory consisting of both positively- and negatively-worded items, 134 items proved to be the most reliable across the three
inventory forms of the first pilot test. Of these 134 items, only seven
were negatively-worded. This result reflects a common problem reported in the literature: reverse-scored items in questionnaires
tend to have the poorest psychometric properties, lowest correlations,
and lowest factor loadings, probably because respondents, especially
those with low reading proficiency (Boss & Strietholt, 2009), find it
hard to process negatively worded items, leading to errors in their answers (Weems, Onwuegbuzie, & Collins, 2006). Negatively-worded
items also do not seem to provide consistent information because
they may not be fully equivalent to their positively-worded counterparts (Weems, Onwuegbuzie, Schreiber, & Eggers, 2003). Thus, to
save some of the items in the first version, some negatively-worded
items in the initial inventory were changed to their positive counterparts (e.g., “I throw away past quizzes or test papers as soon as they
are returned by the instructor” was changed to “I keep quizzes or
test papers returned by my instructor to serve as review materials
for the exam”).
Students’ feedback on the inventory was also considered in the
revision. Common feedback included decreasing the number of items
in the questionnaire, and reducing the number of frequency points
on the Likert scale, since some respondents reported being unable to
discriminate among the options. The initial 6-point
Likert scale was then reduced to a 5-point scale. Another suggestion
was to clarify some items either by giving examples or simplifying
the words used. To facilitate students’ understanding of each item,
this study computed the Flesch’s Reading Ease Score (ranging from
0-100) for each item using the program of The Accessibility Institute
of the University of Texas at Austin (2009). With the exception of
11 items, most items in the second version have readability scores
of at least 60, considered as the standard readability level for adults
(RFP Evaluation Centers, n.d.). Respondents also suggested adding
items on other factors which affect their study skills.
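The Flesch Reading Ease score mentioned above follows a standard published formula based on average sentence length and average syllables per word. A minimal sketch follows; counting words, sentences, and syllables is assumed to be done beforehand:

```python
def flesch_reading_ease(total_words, total_sentences, total_syllables):
    """Flesch Reading Ease score (higher = easier to read).

    A score of at least 60 is commonly treated as plain English
    suitable for adult readers, the standard applied in this study.
    """
    return (206.835
            - 1.015 * (total_words / total_sentences)
            - 84.6 * (total_syllables / total_words))
```

For instance, a one-sentence item of ten monosyllabic words scores well above 60, while a ten-word sentence averaging two syllables per word falls below it.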
The retention of the 134 reliable items, the changing of some
negatively-worded statements to their positive counterparts, and the
consideration of respondents’ feedback resulted in the development of
the second version of the ISSA, consisting of 171 all positively-worded
items in the following domains: Concentration (12), Class Participation (19), Examination (17), Memory (18), Motivation (19), Note-taking (17), Reading (17), Stress Management (17), Time Management (15), and Writing (20). The 171 items were divided into three
survey forms (2A, 2B, 2C) of 60, 62, and 49 items, respectively, and
asked students to rate the frequency of their use of study skills on a
five-point scale.
Scale Administration of the Second Pilot-Test
The second version of the ISSA was administered to first year
college students enrolled in the summer term of 2009 from two universities through their course instructors. As in the first pilot test,
respondents were asked to give demographic information, answer the
BSDS, read the instructions on how to answer the inventory, and
give suggestions or comments on how the survey could be further
improved.
Of the 405 students who completed the questionnaire, only
the data of 301 first-year college students of Ateneo de Naga University (150)
and Universidad de Sta. Isabel (151) were used for the study (37.2%
males and 62.7% females; ages ranged from 16 to 25; M = 17.05;
SD = 1.11). Respondents whose year-level, data completion, and
BSDS score fell short of the set criteria were not included in the
analysis.
Results of the Second Pilot-Test and Further Refinement of the ISSA
To improve the internal consistency of the second version, item
analysis using the Scales procedure in the SPSS was again run during
the second pilot-testing. This time, however, the focus was on the
values of the corrected item-correlations. If the correlation is low, it
means that the item is not really measuring what the questionnaire
is trying to measure (Sherry, 1997). Robinson, Shaver, Wrightsman,
and Andrews (1991) suggested that items with corrected item-total
correlation less than .30 be deleted. However, to ensure approximate representation of items from the ten domains of study
skills, only items meeting the critical cut-off of .40 item-total
correlation were retained (Chachamovich, Fleck, Trentini, Laidlaw,
& Power, 2008). Also, when two items expressing almost the same idea
yielded high item-total correlation values, the item with the higher
correlation value was kept so as not to over-represent particular domains.
To determine the contribution of each item to internal consistency, the Cronbach’s alpha for the entire measure was estimated
along with the resulting alpha value if each item is deleted. Items
which contributed to the maximization of the Cronbach’s alpha values were retained.
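The corrected item-total correlation used above, in which each item is correlated with the sum of the *remaining* items so that the item cannot inflate its own correlation, can be sketched as follows (the function name is an assumption):

```python
import numpy as np

def corrected_item_total_correlations(scores):
    """Pearson r between each item and the sum of the other items.

    scores: (n_respondents, n_items) matrix of item scores.
    Items with values below the chosen cut-off (.40 in this study)
    would be flagged for deletion.
    """
    scores = np.asarray(scores, dtype=float)
    total = scores.sum(axis=1)
    return [float(np.corrcoef(scores[:, j], total - scores[:, j])[0, 1])
            for j in range(scores.shape[1])]
```

Applying the study's rule would then amount to keeping only the indices whose correlation is at least .40.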
Results of the item-analysis of both inventories 2A and 2B revealed that the domains with the least-represented items were Writing (9), Memory (8), Reading (7), Class Participation (5), and Stress
Management (5). To ensure that the third version contained items
that approximately represented the ten domains, Inventory 2C was
created containing additional items in these least-represented areas.
This was done to follow Cronbach’s (1979) advice to ensure that test
items represent the domains of a particular construct at the pilot-testing stage.
Respondents’ comments in the second pilot test were also considered in the revision. They suggested adding more items on other
areas of study skills, making the inventory shorter, further clarifying
the ideas in the statements, and deleting redundant items.
Results show that inventory forms 2A, 2B, and 2C have an
internal consistency alpha of .95, .93, and .96, respectively. Corrected
item-total correlations ranged from .49 to .64 for Inventory 2A, .48
to .72 for Inventory 2B, and .45 to .69 for Inventory 2C.
Table 1 summarizes the distribution of items included in the
third version: Concentration (n = 8, 9.52%), Class Participation (n =
6, 7.14%); Examination (n = 8, 9.52%), Memory (n = 12, 14.29%),
Motivation (n = 10, 11.9%), Note-taking (n = 7, 8.33%), Reading
(n = 7, 8.33%), Stress Management (n = 8, 9.52%), Time Management (n = 7, 8.33%), and Writing (n = 11, 13.10%). This 84-item
inventory is the third version of the ISSA.
Administration of the ISSA to a Large Sample
The third version of the ISSA, consisting of 84 items, was administered to a large sample of 941 students during the second semester
of school year 2009–2010; of these, only 916 cases were used for the study.
Data from respondents who had missing answers or who were not
freshman students were not included in the analysis. Of the 916
respondents, 416 (45.4%) came from Ateneo de Naga University,
Table 1
Composition of Items in the Third Version of the ISSA

Item No. in the    From Inventory/    Study Skill/       Corrected Item-
3rd Version        Item No.           Attitude Domain    Total Correlation
 1                 2A/01              TM                 .600
 2                 2B/32              CN                 .602
 3                 2A/31              TM                 .509
 4                 2A/48              CN                 .571
 5                 2A/46              TM                 .541
 6                 2A/34              CN                 .525
 7                 2A/24              MM                 .513
 8                 2A/02              CN                 .492
 9                 2C/45              MM                 .628
10                 2A/47              TM                 .533
11                 2A/21              TM                 .621
12                 2C/12              WR                 .543
13                 2B/37              NT                 .511
14                 2B/43              MT                 .488
15                 2B/18              MT                 .480
16                 2B/35              EX                 .491
17                 2B/44              MT                 .565
18                 2C/47              MM                 .568
19                 2C/31              SM                 .502
20                 2C/22              CP                 .564
21                 2C/05              RD                 .453
22                 2A/04              MM                 .528
23                 2A/37              MT                 .637
24                 2B/17              WR                 .722
25                 2A/45              TM                 .556
26                 2A/25              NT                 .544
27                 2B/62              MT                 .517
28                 2C/14              WR                 .686
29                 2C/24              CP                 .527
30                 2C/33              SM                 .668
31                 2C/43              MM                 .534
32                 2C/16              WR                 .558
33                 2C/03              RD                 .504
34                 2A/26              NT                 .523
35                 2B/42              WR                 .523
36                 2A/49              EX                 .704
37                 2A/38              CN                 .509
38                 2A/56              MT                 .587
39                 2C/39              SM                 .642
Table 1 (continuation)

Item No. in the    From Inventory/    Study Skill/       Corrected Item-
3rd Version        Item No.           Attitude Domain    Total Correlation
40                 2B/27              WR                 .530
41                 2B/12              CN                 .525
42                 2A/59              MT                 .495
43                 2B/04              CN                 .512
44                 2A/18              WR                 .631
45                 2A/36              EX                 .524
46                 2B/28              MT                 .498
47                 2A/57              CN                 .616
48                 2B/53              NT                 .514
49                 2A/05              NT                 .539
50                 2A/60              EX                 .502
51                 2B/21              CN                 .469
52                 2B/9               EX                 .520
53                 2B/31              MM                 .552
54                 2B/58              MT                 .522
55                 2A/52              NT                 .519
56                 2C/01              RD                 .483
57                 2C/15              WR                 .583
58                 2C/21              CP                 .461
59                 2C/49              MM                 .637
60                 2C/28              MM                 .582
61                 2B/54              NT                 .619
62                 2B/41              EX                 .480
63                 2B/25              RD                 .525
64                 2B/34              EX                 .485
65                 2C/02              RD                 .472
66                 2C/07              RD                 .524
67                 2C/19              WR                 .675
68                 2C/26              CP                 .608
69                 2C/38              SM                 .573
70                 2C/41              MM                 .632
71                 2C/32              SM                 .653
72                 2C/13              WR                 .562
73                 2A/32              SM                 .578
74                 2A/50              EX                 .541
75                 2C/46              MM                 .669
76                 2C/36              SM                 .558
77                 2C/18              WR                 .677
78                 2C/42              MM                 .690
Table 1 (continuation)

Item No. in the    From Inventory/    Study Skill/       Corrected Item-
3rd Version        Item No.           Attitude Domain    Total Correlation
79                 2C/23              CP                 .673
80                 2C/08              RD                 .631
81                 2C/48              MM                 .658
82                 2B/46              TM                 .507
83                 2A/55              MT                 .634
84                 2C/34              SM                 .654

CN = concentration, CP = class participation, MM = memory, MT =
motivation, NT = note-taking, RD = reading, SM = stress management,
TM = time management, EX = examination, WR = writing
374 (40.8%) from Universidad de Sta. Isabel, and 126 (13.7%) from
the University of Nueva Caceres. The sample included 37.7% males
(n = 346) and 62.2% females (n = 570) with ages ranging from 15
to 22 (M = 16.63, SD = 1.49). These freshman students represented the fields of arts (n = 81, 8.8%), business (n = 144, 15.7%),
computers (n = 120, 13.1%), education (n = 94, 10.2%), engineering
(n = 101, 11.0%), health (n = 314, 34.2%), science (n = 55, 6.0%), and
other courses (n = 7, 0.8%).
As in the last survey form, respondents were asked to complete
the demographic information before rating the frequency of their use
of a study skill or attitude. This time, however, the survey form
did not include the BSDS or ask students’ suggestions for revision.
The instructors asked students to complete the survey in
class, collected the completed questionnaires, and returned them to
the researcher.
Results
Factor Structure of the Final Instrument
To examine the factor structure underlying the 84-item version
of the ISSA, an exploratory factor analysis was used. The
sampling adequacy measure yielded a high KMO of
0.966, and Bartlett’s test of sphericity (χ² = 27480.380, degrees of
freedom = 3486) was statistically significant (p < .001), which
indicated that the variables were related and therefore suitable for
structure detection.
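Bartlett's test of sphericity can be computed from the determinant of the inter-item correlation matrix; the sketch below is a generic illustration (statistic and degrees of freedom only, without the chi-square p-value lookup), and the function name is an assumption:

```python
import numpy as np

def bartlett_sphericity(data):
    """Bartlett's test of sphericity.

    data: (n_respondents, n_items) matrix.
    Returns the chi-square statistic and degrees of freedom for the
    null hypothesis that the correlation matrix is an identity matrix
    (i.e., the variables are unrelated).
    """
    data = np.asarray(data, dtype=float)
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)
    chi_square = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    dof = p * (p - 1) // 2
    return chi_square, dof
```

With p = 84 items, the degrees of freedom are 84 × 83 / 2 = 3486, consistent with the value reported above.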
For factor extraction, this study used the Maximum Likelihood
(ML) method to simplify the interpretation of the factors (Costello &
Osborne, 2005; SPSS, 2005) followed by the varimax with Kaiser normalization for rotation. Kaiser (1958) suggested the use of varimax
rotation especially for exploratory factor analysis since varimax tries
to minimize the number of variables that load highly on a factor using
the orthogonal assumption (Stanek, 1993). Maximizing the varimax
function will ensure that any tendency toward a general factor in the
solution will be minimized. This rotation method is appropriate for
the present study since many factors could be underlying most of the
items in the inventory (Gorsuch, 1974).
Several criteria were followed to identify the potential factors
to extract. First, using the Kaiser (1960) criterion, the study
initially considered retaining factors with eigenvalues greater than 1.
Initial factor extraction yielded 18 factors with eigenvalues
exceeding 1; upon ML extraction, however, only 3 factors complied with
the Kaiser criterion, whereas varimax rotation suggested retaining 9
factors. Next, Cattell's (1966) proposal to examine the scree plot
was followed, which suggested a three-factor solution.
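The Kaiser rule can be sketched directly from the eigenvalues of the item correlation matrix. The data here are simulated, not the ISSA responses; since the eigenvalues of a correlation matrix sum to the number of items, a factor retained by this rule explains more variance than any single item.

```python
import numpy as np

def kaiser_count(data):
    """Count factors retained under the Kaiser criterion (eigenvalue > 1)."""
    R = np.corrcoef(data, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]  # descending order
    return int((eigvals > 1).sum()), eigvals

# Simulated responses driven by one common factor
rng = np.random.default_rng(1)
factor = rng.normal(size=(400, 1))
data = factor + 0.6 * rng.normal(size=(400, 8))

n_retained, eigvals = kaiser_count(data)
```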
The factor loading of an item was also considered. In interpreting
the rotated factor pattern, an item was said to load on a factor
if the loading was .30 or greater on the main factor and less than
.30 on the remaining factors (Rogers & Hanlon, 1996). Tabachnick
and Fidell (2001), however, propose that the minimum loading of
an item should be .32. If there are "cross-loaders," items that load
at .32 or higher on two or more factors, then those items should be
dropped, especially if there are several strong-loader items, those
which load at .50 or better on each factor. To increase meaningful
interpretation and to avoid over-factoring, the analysis retained
only items which loaded mainly on one factor with a factor loading
of at least .45, considered a fair loading for interpretation
(Engelberg, Downey, & Curtis, 2006). The analysis also
dropped factors which had fewer than three items, which indicates
an unstable factor (Costello & Osborne, 2005).
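These retention rules amount to a simple filter over the rotated loading matrix. A hedged sketch follows, using the thresholds from the text (.45 primary, .32 cross-loading) and a toy loading matrix invented for illustration:

```python
import numpy as np

def retained_items(loadings, primary=0.45, cross=0.32):
    """Indices of items loading >= `primary` on exactly one factor
    and below `cross` on every other factor (no cross-loaders)."""
    keep = []
    for i, row in enumerate(np.abs(np.asarray(loadings, dtype=float))):
        top = row.argmax()
        rest = np.delete(row, top)
        if row[top] >= primary and (rest < cross).all():
            keep.append(i)
    return keep

# Toy rotated loadings: item 1 is a cross-loader, item 2 loads weakly
demo = [[0.60, 0.10], [0.50, 0.40], [0.20, 0.25], [0.10, 0.70]]
```

Applied to `demo`, the filter keeps only the first and last items: the second fails the cross-loading rule and the third never reaches the .45 primary threshold.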
To confirm the scree plot suggestion of a three-factor solution,
the next step was to evaluate whether the items loaded strongly
on one factor. Using Tabachnick and Fidell’s (2001) criterion on
Margarita Felipe-Fajardo
19
retention of items, the rotated factor matrix shows that of the 84
items, 8 items (09, 21, 22, 35, 37, 52, 58, and 75) failed to load on
any factor. Sixteen items (01, 02, 04, 06, 13, 24, 29, 31, 40, 53, 56,
61, 63, 64, 74, and 79) were cross-loaders, while the rest of the 60
items mainly loaded on one factor.
Table 2 presents the items with factor loadings of at least .45.
Based on Table 2, although 6 items loaded strongly on factors
other than the first three, those factors had to be dropped because
each contained fewer than three items (Costello & Osborne, 2005).
The factor rotation clearly supports the three-factor solution
suggested by the scree plot. The initial 84-item questionnaire was
then reduced to 28 items, with three factors accounting for 24.60%
of the variance.
Once the factors were extracted and identified, a descriptive
name for each factor was assigned by three field experts. In this case,
a general factor label was created based on the evaluators’ description
of the commonality of the items for each factor.
Factor 1, consisting of 11 items and accounting for 9.83% of the
variance, was named Meaningful Learning because the items reflected
students' desire to focus on the importance of the task at hand,
enabling them to find meaning in what they are doing. Items included
in this factor suggest strategies that reflect a deep approach to
learning, as shown by the three highest-loading items (connecting new
with previous learning, relating reading material to one's own
experience, and finding practical use of learning in one's life).
This factor contained a mix of items from the initial domains of
Class Participation (3), Reading (3), Memory (2), Motivation (1),
Stress Management (1), and Writing (1). The item means show that,
of the 11 strategies, the respondents most often used keeping an
open mind in class discussions (M = 3.93) and following
instructors' feedback to improve writing skills (M = 3.84).
Factor 2, consisting of 7 items and accounting for 7.42% of the
variance, was named ‘Self-Regulation.’ The evaluators seemed to
agree that the items in this factor suggest the ability to control and
manage the learning situation to produce a desired outcome as evidenced by the three highest-loading items (finishing required projects
despite their difficulty, thinking of ways to do better in school, and
thinking positively during examinations). Items comprising this
factor come from the domains of Motivation (2), Examination (2),
Time Management (1), Note-taking (1), and Stress Management (1).

Table 2
Strong-Loading Items after Factor Rotation

Item  Loading     Item  Loading
05    .502a       50    .466a
06    .483a       52    .553
11    .480a       54    .550a
12    .582        55    .481a
14    .496a       65    .481a
15    .539a       66    .581a
16    .496a       68    .478a
17    .532a       69    .475a
18    .501        70    .597a
20    .528        74    .480
23    .518a       77    .486a
26    .470a       79    .503a
34    .493a       80    .501a
38    .455a       81    .516a
39    .542        82    .572a
46    .551a       83    .536a
49    .565a       84    .502a

a Items included in the final 28-item inventory.
Extraction method: maximum likelihood; rotation: varimax with Kaiser
normalization.

The distinct Filipino culture of being religious was especially
revealed in the high mean value (4.48) of the item, “I pray to
God [...] when
problems seem too big to handle.” Of the three factors, the items
in this factor show the highest mean frequencies, indicating that
the respondents in this study frequently used self-regulating
strategies to finish their academic tasks.
Factor 3, consisting of 10 items and accounting for 7.36% of the
variance, was named Planning & Organization. The items predominantly
describe strategies which impose structure on the handling of tasks
or a systematic way of planning towards the fulfilment of academic
goals, as revealed by the two highest-loading items (preparing a
schedule for the coming week and making an outline of the lecture).
These items represent the initial domains of Motivation (4), Time
Management (2), Note-taking (2), Concentration (1), and
Examination (1). The item “Before I begin a task I ask myself if
doing this would achieve my goal” exhibited the highest item-mean
value (3.28).
Table 3 summarizes the items included in the three factors.
The total variance accounted for by the three factors was 24.60%:
9.8% was accounted for by the Meaningful Learning subscale and
7.4% each by the Self-Regulation and Planning & Organization
subscales.
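These percentages follow from dividing each factor's eigenvalue (its sum of squared loadings) by the 84 items analyzed. A quick arithmetic check, using the eigenvalues reported in Table 3 (agreement with the reported figures is up to rounding):

```python
# Each factor's share of total variance = eigenvalue / number of items (84).
eigenvalues = {"Meaningful Learning": 8.2,
               "Self-Regulation": 6.2,
               "Planning & Organization": 6.2}
n_items = 84
shares = {name: 100 * ev / n_items for name, ev in eigenvalues.items()}
total = sum(shares.values())  # close to the reported 24.60%
```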
Psychometric Properties of the Final
Instrument
Reliability estimates for the final instrument were established
based on each item’s communality, internal consistency, and corrected item-total correlation. (See Table 4.)
Communality
Communalities indicate the proportion of variance that each item
has in common with the other items (StatSoft, Inc., n.d.). When the
extracted communalities yield high values, there is a strong
suggestion of internal consistency among the factors (Tam & Coleman,
2009). However, communalities must be interpreted in relation to
the interpretability of the factors: what is critical is not the
communality coefficient itself but the extent to which the item
plays a role in the interpretation of a factor. A communality value
Table 3
Factor Loadings, Eigenvalues, Percentages of Variance, Means and
Standard Deviations of the 28-item ISSA

Factor 1: Meaningful Learning (Eigenvalue: 8.2; Percentage of
Variance: 9.8%)

MT 54. I try to find practical use in my life of the things I am
  learning in each subject. (FL = .550, M = 3.66, SD = 0.869)
CP 55. Through my instructor's body language and tone of voice, I
  know how he/she feels about the topic being discussed.
  (FL = .481, M = 3.68, SD = 0.956)
RD 65. I go over charts and pictures included in the text that I
  read to be more familiar with the topic. (FL = .481, M = 3.69,
  SD = 0.884)
RD 66. I try to relate to my own experience the things that I read
  to understand it better. (FL = .581, M = 3.77, SD = 0.876)
CP 68. In class, I give more attention to what the speaker says and
  not on the way he/she says it (e.g., mispronunciation, etc.).
  (FL = .478, M = 3.67, SD = 0.911)
SM 69. I think about the events which cause me stress and try to
  avoid them as much as possible. (FL = .475, M = 3.65, SD = 0.869)
MM 70. I try to connect new learning with what I have previously
  learned to remember information (e.g., relate theory of supply
  and demand with market prices). (FL = .597, M = 3.59, SD = 0.865)
WR 77. I use my instructors' feedback on my papers to improve my
  writing skills. (FL = .486, M = 3.84, SD = 0.928)
CP 79. I try to listen and keep an open mind when my professor or
  classmate shares an opinion which is different from mine.
  (FL = .503, M = 3.93, SD = 0.871)
RD 80. After reading a text I reflect on what I have learned.
  (FL = .501, M = 3.69, SD = 0.890)
MM 81. I try to find meaning in the information that I want to
  remember instead of just memorizing them. (FL = .516, M = 3.75,
  SD = 0.905)
Table 3
(continuation)

Factor 2: Self-Regulation (Eigenvalue: 6.2; Percentage of
Variance: 7.4%)

TM 11. I make sure that I submit assignments or projects on or
  before the deadline. (FL = .480, M = 4.01, SD = 0.960)
EX 23. Before taking a test, I tell myself that I will do well on
  this exam. (FL = .518, M = 3.88, SD = 0.969)
NT 26. I include in my notes the examples given by the lecturer to
  explain the points discussed in class. (FL = .470, M = 3.87,
  SD = 0.957)
MT 46. I try to finish required projects even if they are not
  enjoyable to do. (FL = .551, M = 3.91, SD = 1.01)
EX 50. I keep returned quizzes or test papers to serve as review
  materials for the exam. (FL = .466, M = 3.91, SD = 1.07)
MT 83. I think of ways to do better in school. (FL = .536,
  M = 4.02, SD = 0.901)
SM 84. I pray to God to unburden myself when problems seem too big
  to handle. (FL = .502, M = 4.48, SD = 0.891)
may be low yet still meaningful if the item contributes to a
well-defined factor (Garson, 2008).
In the Meaningful Learning subscale, although item 68 (giving
more importance to the message of the speaker than to the manner
of delivery) had a low communality value of .296, it was retained
because it contributes to the interpretability of the factor.
Moreover, when the item was deleted, the reliability coefficient
alpha of the subscale did not increase (α = .855), which indicated
that the item contributes to the internal consistency of the scale.
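A communality is computed directly from the loading matrix: each item's h² is its sum of squared loadings over the retained factors. A minimal sketch, with a toy two-factor loading matrix invented for illustration:

```python
import numpy as np

def communalities(loadings):
    """h² for each item: sum of squared loadings across retained factors."""
    return (np.asarray(loadings, dtype=float) ** 2).sum(axis=1)

# Toy loadings for two items on two factors
h2 = communalities([[0.5, 0.2], [0.3, 0.4]])
```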
Internal Consistency
Reliability estimates through Cronbach's coefficient alpha
were also computed for the entire scale as well as for its subscales.
The ISSA revealed an excellent alpha coefficient of .90 for the entire
test and adequate reliability α values for each of its three subscales:
Factor 1 (α = .86, n = 11), Factor 2 (α = .79, n = 7) and Factor
3 (α = .81, n = 10). According to Carmines and Zeller (1979), an
instrument with α ≥ .80 is considered good and reliable.
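Cronbach's alpha compares the sum of the item variances with the variance of the total score. A minimal NumPy sketch on toy data (not the ISSA responses):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    X = np.asarray(scores, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Perfectly consistent items: each respondent gives identical ratings
X = np.tile(np.arange(1, 11).reshape(-1, 1), (1, 3))
alpha = cronbach_alpha(X)
```

When the items are perfectly correlated, as in this toy matrix, alpha reaches its maximum of 1; real scales such as the ISSA fall below that ceiling to the degree that items vary independently.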
Table 3
(continuation)

Factor 3: Planning & Organization (Eigenvalue: 6.2; Percentage of
Variance: 7.4%)

TM 05. I make a master schedule of fixed activities for the whole
  semester (e.g., schedule of classes and examination, due dates of
  projects, etc.). (FL = .502, M = 3.19, SD = 1.12)
CN 06. I let my house or roommates know my quiet hours of study
  when I cannot be disturbed. (FL = .483, M = 2.99, SD = 1.16)
MT 14. I monitor my progress (e.g., recording quiz or exam scores)
  in each of my subjects. (FL = .496, M = 3.13, SD = 1.04)
MT 15. Before I begin a task I ask myself, “Will doing this help me
  achieve my goal?” (FL = .532, M = 3.28, SD = 1.03)
EX 16. When taking an examination, I check first the entire exam
  and plan how much time I should spend answering each part.
  (FL = .496, M = 3.18, SD = 1.04)
MT 17. I do further readings on my own or answer more exercises
  even if my instructors do not require them. (FL = .532, M = 3.03,
  SD = 0.963)
NT 34. When taking notes from a textbook, I write on the top page
  of my notebook the date, topic, and page numbers of my source.
  (FL = .493, M = 2.99, SD = 1.13)
MT 38. I come to class prepared, having read the assigned text or
  answered the homework. (FL = .455, M = 3.25, SD = 0.858)
NT 49. I make an outline of the day's lecture complete with
  headings and subheadings in each of my subjects. (FL = .565,
  M = 2.99, SD = 1.08)
TM 82. Each Sunday, I prepare my schedule for the coming week.
  (FL = .572, M = 3.17, SD = 1.19)

FL = factor loading, M = mean, SD = standard deviation. Extraction
method: maximum likelihood; rotation method: varimax with Kaiser
normalization. Rotation converged in 23 iterations.
Table 4
Internal Consistency Reliability, Item Analysis, and Communality
(h²) of the 28-item ISSA

Factor 1: Meaningful Learning (subscale α = .860)

54. I try to find practical use in my life of the things I am
  learning in each subject. (α if deleted = .846,
  item-total r = .586, h² = .499)
55. Through my instructor's body language and tone of voice, I know
  how he/she feels about the topic being discussed.
  (α if deleted = .851, item-total r = .521, h² = .424)
65. I go over charts and pictures included in the text that I read
  to be more familiar with the topic. (α if deleted = .850,
  item-total r = .528, h² = .428)
66. I try to relate to my own experience the things that I read to
  understand it better. (α if deleted = .846, item-total r = .590,
  h² = .475)
68. In class, I give more attention to what the speaker says and
  not on the way he/she says it (e.g., mispronunciation, etc.).
  (α if deleted = .855, item-total r = .466, h² = .296)
69. I think about the events which cause me stress and try to avoid
  them as much as possible. (α if deleted = .852,
  item-total r = .510, h² = .367)
70. I try to connect new learning with what I have previously
  learned to remember information (e.g., relate theory of supply
  and demand with market prices). (α if deleted = .846,
  item-total r = .591, h² = .494)
77. I use my instructors' feedback on my papers to improve my
  writing skills. (α if deleted = .848, item-total r = .560,
  h² = .437)
79. I try to listen and keep an open mind when my professor or
  classmate shares an opinion which is different from mine.
  (α if deleted = .847, item-total r = .578, h² = .494)
80. After reading a text I reflect on what I have learned.
  (α if deleted = .847, item-total r = .576, h² = .508)
81. I try to find meaning in the information that I want to
  remember instead of just memorizing them. (α if deleted = .847,
  item-total r = .577, h² = .416)

Factor 2: Self-Regulation (subscale α = .787)

11. I make sure that I submit assignments or projects on or before
  the deadline. (α if deleted = .770, item-total r = .458,
  h² = .411)
23. Before taking a test, I tell myself that I will do well on this
  exam. (α if deleted = .757, item-total r = .524, h² = .420)
Table 4
(continuation)

26. I include in my notes the examples given by the lecturer to
  explain the points discussed in class. (α if deleted = .760,
  item-total r = .511, h² = .463)
46. I try to finish required projects even if they are not
  enjoyable to do. (α if deleted = .753, item-total r = .548,
  h² = .549)
50. I keep returned quizzes or test papers to serve as review
  materials for the exam. (α if deleted = .763,
  item-total r = .503, h² = .431)
83. I think of ways to do better in school. (α if deleted = .751,
  item-total r = .560, h² = .545)
84. I pray to God to unburden myself when problems seem too big to
  handle. (α if deleted = .764, item-total r = .492, h² = .416)

Factor 3: Planning and Organization (subscale α = .811)

05. I make a master schedule of fixed activities for the whole
  semester (e.g., schedule of classes and examination, due dates of
  projects, etc.). (α if deleted = .794, item-total r = .501,
  h² = .523)
06. I let my house or roommates know my quiet hours of study when I
  cannot be disturbed. (α if deleted = .797, item-total r = .473,
  h² = .406)
14. I monitor my progress (e.g., recording quiz or exam scores) in
  each of my subjects. (α if deleted = .798, item-total r = .458,
  h² = .366)
15. Before I begin a task I ask myself, “Will doing this help me
  achieve my goal?” (α if deleted = .792, item-total r = .519,
  h² = .420)
16. When taking an examination, I check first the entire exam and
  plan how much time I should spend answering each part.
  (α if deleted = .799, item-total r = .453, h² = .338)
17. I do further readings on my own or answer more exercises even
  if my instructors do not require them. (α if deleted = .794,
  item-total r = .499, h² = .440)
34. When taking notes from a textbook, I write on the top page of
  my notebook the date, topic, and page numbers of my source.
  (α if deleted = .796, item-total r = .482, h² = .363)
38. I come to class prepared, having read the assigned text or
  answered the homework. (α if deleted = .796, item-total r = .485,
  h² = .495)
49. I make an outline of the day's lecture complete with headings
  and subheadings in each of my subjects. (α if deleted = .791,
  item-total r = .525, h² = .490)
82. Each Sunday, I prepare my schedule for the coming week.
  (α if deleted = .792, item-total r = .516, h² = .486)

Cronbach's coefficient α of the entire ISSA = .904
Item-Total Correlation
Correlation between each item and the total score was computed for
the 28-item ISSA and each of its subscales. In this third phase,
the study followed Robinson et al.'s (1991) suggestion not to
include items with corrected item-total correlations below .30.
Results showed that no item fell below this value; thus all 28
items were retained. The corrected
item-total correlations ranged from .38 to .59 for the entire scale,
.47 to .59 for the Meaningful Learning subscale, .46 to .56 for the
Self-Regulation subscale, and .45 to .52 for the Planning & Organization subscale. These values further strengthened the psychometric
quality of the instrument.
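A corrected item-total correlation correlates each item with the total of the remaining items, so the item does not inflate its own criterion. A sketch with toy ratings invented for illustration:

```python
import numpy as np

def corrected_item_total(scores):
    """Correlation of each item with the sum of the other items."""
    X = np.asarray(scores, dtype=float)
    total = X.sum(axis=1)
    return np.array([np.corrcoef(X[:, j], total - X[:, j])[0, 1]
                     for j in range(X.shape[1])])

# Toy ratings: the first two items track each other closely
demo = np.array([[1, 1, 2], [2, 2, 1], [3, 3, 3], [4, 4, 2], [5, 5, 4]])
r = corrected_item_total(demo)
```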
Summary, Implications, Recommendations,
and Conclusion
Factor Structure and Psychometric Properties
of the ISSA
The Inventory of Study Skills and Attitudes (ISSA) is a 28-item
self-report inventory that assesses students’ use of effective study
strategies and attitudes. Items are rated using a 5-point Likert scale
ranging from (1) never to (5) always. A higher score indicates more
frequent use of study skills or adoption of a positive attitude
towards learning. The ISSA underwent three revisions before its
final version. Using item analysis, the first version of 279 items
was reduced to 171 in the second version and 84 in the third.
Exploratory factor analysis of the third version of the instrument
revealed that the ISSA is multi-dimensional, consisting of three
constructs in its final version of 28 items. Good face and content
validity was achieved by the ISSA through the evaluation of the
experts. Modest construct validity was achieved for the final
solution through the examination of the communalities. Caution is
warranted, however, because this three-factor solution accounted
for only 24.6% of the variance, which indicates that much of the
variance remains unexplained. Overall, the results of
reliability and validity estimation suggest that the ISSA has achieved
excellent reliability of .90 for the entire inventory, and moderate to
high internal consistency Cronbach’s alpha of .86 for the Meaningful
Learning subscale, .79 for the Self-Regulation subscale, and .81 for
the Planning & Organization subscale.
Implications of the Study
The factor structure of the ISSA has several important implications
for both instructors and students. In light of the findings,
instructors should select instructional strategies that encourage
students to develop meaningful learning approaches such as
activating schema, concept-mapping, discovery learning,
problem-solving, critical thinking (Hummel, 1997), and integrative
learning (Laird, Niskodé-Dossett, & Kuh, 2009). Moreover,
instructors should reflect on whether their assessments
overemphasize rote learning, as such assessments merely encourage
memorization rather than deep understanding of the material. This
also means that instructors may do well to spend extra time in
class teaching students not only explicit cognitive study
strategies but also meta-cognitive ones to aid them in the
planning, monitoring, and evaluation of their learning outcomes.
To encourage self-regulation, educators should provide students
with opportunities for some choice and control over their learning.
When students' choices and decisions are respected, they are better
able to discriminate which strategies work for them, leading toward
more relevant learning.
The students, on the other hand, may benefit from an evaluation of their goals, motivations, and beliefs about university education. Unless students realize that they are responsible for their own
learning, university education will be an aimless journey. Reflecting
on, integrating, and applying their learning will ensure that they are
not only passive recipients of knowledge but producers of knowledge
as well (Jones, Valdez, Nowakowski, & Rasmussen, 1994).
Limitations and Recommendations
While the ISSA’s reliability and initial validity were established
in this study, there are important limitations to these findings.
First, this study used freshman university students as respondents.
Future research with groups not sampled in this study might bring
different perspectives and responses and shed light on whether item
and scale variability increase with more diverse samples.
Second, the scales were identified statistically after data
collection was completed; thus, validity analysis was conducted
using only those items included in the item pool. Additional
validity analyses adding selected items not covered by the final
inventory would be useful to confirm these findings. Furthermore,
additional
validation studies will better define and clarify whether the labels for
the factors in this study are accurate and appropriate, and whether
this instrument correlates highly with academic achievement through
measures of students’ quality point index.
The ISSA has also not been tested for a number of other important
measurement characteristics such as stability (test-retest) and
responsiveness (change over time). These necessary measurement
characteristics may be evaluated in future studies.
Lastly, the self-report procedure, often tinged with social
desirability bias, is not the only method appropriate for assessing
students' study skills. Students' study strategies may be better
assessed through think-aloud procedures or direct observation by
researchers to validate students' reports of their strategy use.
Conclusion
This study has established the reliability and initial validity
of the Inventory of Study Skills and Attitudes (ISSA). It has the
potential for use as a diagnostic and prescriptive tool to measure a
student’s use of study skills and attitudes toward effective learning
in three constructs. It is not meant, however, to be used as a
determinant of success or failure in college. Academic counselors
and instructors may use the inventory to determine students'
present study strategies and use such information as a baseline for
strategy instruction or training. Additionally, instructors could
use the inventory to
measure students’ progress at multiple points. Freshman students
themselves may complete the ISSA at the beginning of their first
semester in college and use that information to identify
strengths and areas for improvement. There is reason for cautious
optimism at this point of the study that the ISSA is a valid and
reliable instrument contextualized in the experience of the Filipino
college student. It is hoped that this instrument will be valuable
to educators and students alike who place much value in university
education.
References
Allgood, W. P., Risko, V. J., Alvarez, M. C., & Fairbanks, M. M. (2000).
Factors that influence study. In R. F. Flippo & D. C. Caverly (Eds.),
Handbook of college reading and study strategies (pp. 201–210). Mahwah, NJ: Lawrence Erlbaum Associates.
Antonak, R. F., & Larrivee, B. (1997). Psychometric analysis and revision
of the opinions relative to mainstreaming scale. Exceptional Children,
62 , 139–149.
Ashmore, R. D., Del Boca, F. R., & Bilder, S. M. (1995). Construction and
validation of the gender attitude inventory: A structured inventory to
assess multiple dimensions of gender attitude. Sex Roles: A Journal
of Research, 32 , 753–760.
Balduf, M. (2009). Underachievement among college students. Journal of
Advanced Academics, 20 , 274–294.
Boss, W., & Strietholt, R. (2009, September). On the validity of
negatively worded items in a PIRLS 2006 context (symposium 852).
Retrieved November 26, 2009, from http://www.eera-ecer.eu/
ecer-programmes-and-presentations/conference/ecer-2009/
contribution/2373/?no_cache=1
Carmines, E. G., & Zeller, R. A. (1979). Reliability and validity assessment
(13th ed.). Newbury Park, CA: Sage.
Cattell, R. B. (1966). The scree test for the number of factors.
Multivariate Behavioral Research, 1, 245–276.
Caverly, D. C., & Flippo, R. F. (Eds.). (2000). The handbook of college
reading and study strategy research. Mahwah, NJ: Lawrence Erlbaum
Associates.
Chachamovich, E., Fleck, M. P., Trentini, C. M., Laidlaw, K., & Power,
M. J. (2008). Development and validation of the Brazilian version of
the Attitudes to Aging Questionnaire (AAQ): An example of merging
classical psychometric theory and the Rasch measurement model.
Health and Quality of Life Outcomes, 6 . Available from http://
www.hqlo.com/content/6/1/5
Cole, R. P., Goetz, E. T., & Willson, V. (2000). Epistemological beliefs
of underprepared college students. Journal of College Reading and
Learning, 31 , 60–70.
Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory
factor analysis: Four recommendations for getting the most from
your analysis. Practical Assessment, Research & Evaluation, 10 .
Available from http://pareonline.net/getvn.asp?v=10&n=7
Crandall, J., Jaramillo, A., Olsen, L., & Peyton, J. K. (2002). Using cognitive strategies to develop English language and literacy. Retrieved
from ERIC database.
Cronbach, L. J. (1950). Further evidence on response sets and test design.
Educational and Psychological Measurement, 10 , 3–31.
Cronbach, L. J. (1979). Essentials of psychological testing (3rd ed.). New
York: Harper & Row Publishers.
Cukras, G. G. (2006). The investigation of study strategies that maximize
learning for underprepared students. College Teaching, 54 , 194–197.
Engelberg, R., Downey, L., & Curtis, J. R. (2006). Psychometric characteristics of a quality of communication questionnaire assessing communication about end-of-life care. Journal of Palliative Medicine, 9 ,
1086–1098.
Fasset, D. R. (2002). How can I help myself? Self-knowledge, self-advocacy
and academic success. Retrieved from ERIC database.
Garson, G. D. G. (2008). Statnotes: Topics in multivariate analysis. Retrieved January 07, 2010, from http://faculty.chass.ncsu.edu/
garson
Goetz, E. T., & Palmer, D. J. (1991). The role of students’ perceptions
of study strategy and personal attributes in strategy use. Reading
Psychology: An International Quarterly, 12 , 199–217.
Gorsuch, R. L. (1974). Factor analysis (1st ed.). Hillsdale, NJ: Erlbaum.
Grimes, S. K., & David, K. C. (1999). Underprepared community college students: Implications of attitudinal and experiential differences.
Community College Review , 27 , 73–83.
Haghighat, R. (2005). The development of the Brief Social
Desirability Scale (BSDS). Europe's Journal of Psychology. Available
from http://www.ejop.org/archives/2007/11/the_development.html
Hummel, J. H. (1997, May). Meaningful learning. Retrieved April 6,
2010, from http://www.valdosta.edu/~jhummel/psy310/!meaning.htm
Item analysis and estimating reliability tutorial 8. (n.d.). [Word
document]. Retrieved February 3, 2010, from
http://mona.uwi.edu/spsw/downloads/coursemat/PS28C/2008-2009/sem2/
Tutorial%208%20-%20Item%20Analysis%20and%20Estimating%20Reliability.doc
Jones, B., Valdez, G., Nowakowski, J., & Rasmussen, C. (1994).
Designing learning and technology for educational reform. Oak
Brook, IL: North Central Regional Educational Laboratory. Retrieved
April 6, 2010, from http://www.ncrel.org/sdrs/engaged.htm
Kaiser, H. F. (1958). The varimax criterion for analytic rotation
in factor analysis. Psychometrika, 23, 187–200.
Kaiser, H. F. (1960). The application of electronic computers to factor
analysis. Educational and Psychological Measurement, 20 , 141–151.
Kayler, H., & Sherman, J. (2009). At-risk ninth-grade students: A psychoeducational group approach to increase study skills and grade point
averages. Professional School Counseling, 12 , 434–439.
Kidwell, K. S. (2005). Understanding the college first-year experience. The
Clearing House, 78 , 253–255.
Kitsantas, A., Winsler, A., & Huie, F. (2008). Self-regulation and ability
predictors of academic success during college: A predictive validity
32
Inventory of Study Skills and Attitudes
study. Journal of Advanced Academics, 20 , 42–68.
Laird, T. F. N., Niskodé-Dossett, A. S., & Kuh, G. D. (2009). What
general education courses contribute to essential learning outcomes.
The Journal of General Education, 58 , 66–84.
Legault, L., Green-Demers, I., & Pelletier, L. (2006). Why do high school
students lack motivation in the classroom? Toward an understanding
of academic amotivation and the role of social support. Journal of
Educational Psychology, 98 , 567–582.
Lizzio, A., & Wilson, K. (2004). First-year students’ perceptions of capability. Studies in Higher Education, 29 , 109–128.
Meneghetti, C., De Beni, R., & Cornoldi, C. (2007). Strategic knowledge and consistency in students with good and poor study skills.
European Journal of Cognitive Psychology, 19 , 628–649.
Merydith, S. P., Prout, H. T., & Blaha, J. (2003). Social desirability and
behavior rating scales: An exploratory study with the Child Behavior
Checklist/4-18. Psychology in the Schools, 40 , 225–235.
Moreno, A., & Martin, E. (2007). The development of learning to learn in
Spain. The Curriculum Journal , 18 , 175–193.
Nunnally, J. C. (1978). Psychometric theory. New York: McGraw-Hill.
Petersen, R., Lavelle, E., & Guarino, A. J. (2006). The relationship between
college students’ executive functioning and study strategies. Journal
of College Reading and Learning, 36 , 59–67.
Phan, H. P. (2009). Amalgamation of future time orientation, epistemological beliefs, achievement goals, and study strategies: Empirical
evidence established. British Journal of Educational Psychology, 79 ,
155–173.
Piazza, C. L., & Siebert, C. F. (2008). Development and validation of a
writing dispositions scale for elementary and middle school students.
The Journal of Educational Research, 101 , 275–285.
Purdie, N., & Hattie, J. (1999). The relationship between study skills and
learning outcomes: A meta-analysis. Australian Journal of Education, 43 , 72–82.
RFP Evaluation Centers. (n.d.). Flesch reading ease readability
score. Retrieved December 17, 2009, from http://
rfptemplates.technologyevaluation.com/Readability
-Scores/Flesch-Reading-Ease-Readability-Score.html
Robinson, J. P., Shaver, P. R., Wrightsman, L. S., & Andrews, F. M. (1991).
Measures of personality and social psychological attitudes. San Diego,
CA: Academic Press.
Robyak, J. E. (1978). Study skills versus non-study skills students: A
discriminant analysis. The Journal of Educational Research, 71 , 161–
166.
Rogers, J. R., & Hanlon, P. J. (1996). Psychometric analysis of the college
student reasons for living inventory. Measurement and Evaluation in
Counseling and Development, 29 , 13–24.
Sherry, L. (1997). Item analysis. Retrieved November 26, 2009, from
http://carbon.cudenver.edu/~lsherry/rem/item_analysis.html
Stanek, D. M. (1993). Factor analysis. Retrieved August 28, 2009, from
http://www.its.ucdavis.edu/telecom/r11/factan.html
Statistical Package for Social Sciences. (2005). SPSS for Windows evaluation version (Release 14.0). [Software]. Chicago: SPSS.
StatSoft, Inc. (n.d.). Electronic statistics textbook. Tulsa, OK: StatSoft. Retrieved December 4, 2009, from http://www.statsoft.com/
textbook/stfacan.html
Stewart, J., & Landine, J. (1995). Study skills from a metacognitive perspective. Guidance and Counseling, 11 , 16–20.
Stupnisky, R. H., Renaud, R. D., Daniels, L. M., Haynes, T. L., & Perry,
R. P. (2008). The interrelation of first-year college students’ critical thinking disposition, perceived academic control, and academic
achievement. Research in Higher Education, 49 , 513–530.
Tabachnick, B. G., & Fidell, L. S. (2001). Using multivariate
statistics (4th ed.). Boston: Allyn and Bacon.
Tam, D. M. Y., & Coleman, H. (2009). Construction and validation of
a professional suitability scale for social work practice. Journal of
Social Work Education, 45 , 47–63.
The Accessibility Institute of the University of Texas at Austin. (2009).
Txreadability: A multilanguage readability tool. Retrieved December
17, 2009, from http://webapps.lib.utexas.edu/TxReadability/
app
Tuckman, B. (2003, August). The “strategies-for-achievement”
approach for teaching study skills. Paper presented at the annual
conference of the American Psychological Association, Toronto, ON.
Weems, G. H., Onwuegbuzie, A. J., & Collins, K. M. T. (2006). The role
of reading comprehension in responses to positively and negatively
worded items on rating scales. Evaluation and Research in Education,
19 , 3–20.
Weems, G. H., Onwuegbuzie, A. J., Schreiber, J. B., & Eggers, S. J. (2003).
Characteristics of respondents who respond differently to positively
and negatively worded items on rating scales. Assessment & Evaluation in Higher Education, 28 , 587–606.
Yip, M. C. W., & Chung, O. L. L. (2005). Relationship of study strategies and academic performance in different learning phases of higher
education in Hong Kong. Educational Research and Evaluation, 11,
61–70.
Young, D. B., & Ley, K. (2000). Developmental students don’t know that
they don’t know. Part I: Self-regulation. Journal of College Reading
and Learning, 31 , 54–64.
Margarita Felipe-Fajardo ([email protected]) is an Assistant
Professor in the Department of Literature and Language Studies at
Ateneo de Naga University. She earned her Master of Arts in
Literature with a specialization in literature and language teaching
at Ateneo de Manila University and is currently studying towards a
Doctor of Education degree at the University of Wollongong in New
South Wales, Australia. One of her research interests is learner
autonomy, which explores ways to teach students how to learn. Her
master's thesis studied the kinds of strategies students use to
learn English as a second language. This particular research
identifies the most effective strategies Filipino students adopt to
cope with the demands of studying in higher education.
This article is based on a project funded by the University Research Council
(TP-URC-005 or TPP-1-2009-05).