Institutional Strategy and Analysis:
Spring 2002 Self-Assessment Survey
Introduction
In April 2002 UPA conducted its annual Self-Assessment Survey. The survey asked respondents to assess the UPA web site in general as well as the various types of data and reports available on the site (i.e., student records, student surveys, faculty and staff reports, and other miscellaneous reports) and its planning and assessment information. The survey also asked clients about their satisfaction with ad hoc requests made of UPA. Finally, respondents were asked to comment on how UPA could improve its services and to describe any future needs they might have for UPA services. A copy of the survey instrument is attached.
The sample consisted of 322 potential UPA clients from the Chancellor's and Provost's offices, college and department administrators, unit directors, and committee chairs. The overall response rate was 36%, ranging from 22% for college administrators to 56% for directors (see Table 1). Given the relatively small populations and the low response rate, the results should be interpreted with caution, especially when comparing the separate groups of respondents.
Table 1: Response Rate Information

| Title Category | Population N | Population % | Survey N | Survey % | RR % | MOE % |
|---|---|---|---|---|---|---|
| Chancellors | 25 | 7.8% | 9 | 7.4% | 36.0% | ±20.9 |
| College Admin | 37 | 11.5% | 8 | 6.6% | 21.6% | ±27.2 |
| Committee Chairs | 32 | 9.9% | 12 | 9.9% | 37.5% | ±17.7 |
| Department Admin | 187 | 58.1% | 67 | 55.4% | 35.8% | ±7.7 |
| Directors | 23 | 7.1% | 13 | 10.7% | 56.5% | ±18.2 |
| Provosts | 18 | 5.6% | 6 | 5.0% | 33.3% | ±26.7 |
| Subtotal | 322 | 100.0% | 115 | 95.0% | 35.7% | ±5.9 |
| No Title | 0 | 0.0% | 6 | 5.0% | | |
| Total | 322 | 100.0% | 121 | 100.0% | | |
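The report does not state how the margin-of-error (MOE) figures in Table 1 were computed. A common approach for small, fully enumerated groups such as these is a proportion-based margin of error with a finite-population correction; the sketch below (Python, with a hypothetical helper name) illustrates that calculation as an assumption rather than UPA's documented method, and it reproduces the directors' figure only approximately.

```python
import math

def proportion_moe(population, respondents, p=0.5, z=1.96):
    """Approximate 95% margin of error (in percentage points) for a
    proportion, with a finite-population correction. This is one common
    formula; the exact method behind Table 1's MOE column is not
    documented in this report."""
    se = math.sqrt(p * (1 - p) / respondents)
    fpc = math.sqrt((population - respondents) / (population - 1))
    return 100 * z * se * fpc

# Example: unit directors (population of 23, 13 respondents)
print(round(proportion_moe(23, 13), 1))  # roughly +/- 18 percentage points
```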
Frequency of Visits to UPA Web Site
About half of those responding to the survey said they had visited the UPA web site in the past year. Most of those visiting the site did so no more than once or twice a month. The heaviest users, relatively speaking, appear to be from the provost's office, college administrators, and unit directors; the lightest users are from the chancellor's office and department administrators. Those not visiting the UPA web site are most likely to say that they do not need the information on the site or that they are not aware of it.
How often visited UPA Web site in last year

| | All | Chanc. | Col | Com. Chair | Dept | Dir | Prov |
|---|---|---|---|---|---|---|---|
| | n=112 | n=9 | n=8 | n=11 | n=65 | n=13 | n=6 |
| Used (total) | 54% | 22% | 87% | 64% | 43% | 77% | 100% |
| Once a week or more often | 8% | . | 37% | . | . | 23% | 50% |
| One or two times a month | 18% | . | 25% | 9% | 15% | 38% | 33% |
| Less than once a month | 28% | 22% | 25% | 55% | 28% | 15% | 17% |
| Never used (total) | 46% | 78% | 12% | 36% | 57% | 33% | 0.0% |
| Never: no need | 20% | 44% | 12% | 18% | 23% | 8% | . |
| Never: get UPA data from print reports | 3% | . | . | . | 5% | . | . |
| Never: get data from non-UPA sources | 5% | 11% | . | . | 5% | 15% | . |
| Never: did not know about | 15% | 22% | . | 18% | 20% | . | . |
| Never: other reason | 3% | . | . | . | 5% | . | . |
Ratings of the UPA Web Site Overall (Tables 2 and 3)
Those using the UPA web site gave positive ratings to various dimensions of it. The highest ratings were given to the logical organization of the pages and to speed (load time). Although still rated positively by the majority of users, lower ratings were given to the attractiveness of the site and the ease of finding information.
Table 2: UPA Web Site Overall (Mean Ratings)

| | N | Mean* |
|---|---|---|
| Logical organization of the pages | 60 | 3.5 |
| Format and organization of data tables | 62 | 3.3 |
| Viewability | 61 | 3.4 |
| Attractiveness | 61 | 3.2 |
| Speed (i.e., load time) | 57 | 3.5 |
| Download speed | 49 | 3.4 |

*Based on a 4-point scale ("4" = "excellent," "3" = "good," "2" = "fair," "1" = "poor")
Table 3: Navigation of UPA Web Site (Mean Ratings)

| | N | Mean* |
|---|---|---|
| Ability to find information | 60 | 3.2 |
| Ability to extract information | 48 | 3.3 |

*Based on a 4-point scale ("4" = "Very Easy," "3" = "Easy," "2" = "Somewhat Difficult," "1" = "Difficult")
Institutional Research Data and Reports (Tables 4 and 5)
UPA web-site users from all administrative groups are most likely to report using enrollment reports (92% overall), followed by student survey reports (75% overall) (Table 4). About two-thirds of all users report using credit hour reports (66%), degrees conferred reports (65%), and admissions reports (62%).

There are some suggestive differences in how university offices use the UPA data and reports on the web, although given the low numbers of respondents these differences need to be interpreted with caution. College administrators tend to be among the heaviest users of most reports, especially those on the "hidden" web pages. Department administrators are relatively high users of the student records and student survey reports, but do not use the hidden web pages as much as might be expected. Those in the provost's office are the most likely to use the faculty and staff reports and, along with committee chairs, the NCSU organizational chart.
Table 4: Use of Institutional Research Web Pages, by Title Category

Cell entries: percent using page (N using page)

| | All | Chanc. | Col | Com. Chair | Dept | Dir | Prov |
|---|---|---|---|---|---|---|---|
| Student Records: | | | | | | | |
| Admissions reports | 62% (37) | 33% (1) | 86% (6) | 50% (3) | 57% (16) | 70% (7) | 67% (4) |
| Enrollment reports | 92% (56) | 67% (2) | 100% (7) | 83% (5) | 90% (26) | 100% (10) | 100% (6) |
| Credit Hours reports | 66% (37) | 33% (1) | 86% (6) | 33% (2) | 73% (19) | 67% (6) | 60% (3) |
| Degrees Conferred reports | 65% (39) | 33% (1) | 86% (6) | 17% (1) | 86% (24) | 40% (4) | 50% (3) |
| Student Progress reports | 36% (21) | 33% (1) | 29% (2) | . | 46% (13) | 33% (3) | 33% (2) |
| Student Survey reports | 75% (46) | . | 86% (6) | 67% (4) | 79% (23) | 80% (8) | 83% (5) |
| Faculty and Staff reports | 43% (26) | 33% (1) | 43% (3) | 50% (3) | 43% (12) | 20% (2) | 83% (5) |
| Other Reports: | | | | | | | |
| Funding Formula | 29% (17) | 33% (1) | 43% (3) | 17% (1) | 25% (7) | 33% (3) | 33% (2) |
| NCSU Organizational Chart | 52% (32) | 33% (1) | 57% (4) | 83% (5) | 41% (12) | 50% (5) | 83% (5) |
| "Hidden" Web pages: | | | | | | | |
| Internal data | 24% (14) | . | 71% (5) | . | 22% (6) | 11% (1) | 33% (2) |
| Dept.-level survey data | 37% (22) | . | 83% (6) | 17% (1) | 43% (12) | 20% (2) | 17% (1) |
| Planning and Evaluation | 18% (10) | . | 50% (3) | . | 15% (4) | 12% (1) | 33% (2) |
Majorities of those accessing the various UPA reports give positive ratings to the intelligibility and timeliness of the reports (Table 5). Admissions reports received the most consistently high ratings, averaging 3.4 for both intelligibility and timeliness. Enrollment reports also received an average rating of 3.4 for intelligibility. Users gave relatively low intelligibility ratings to the funding formula (2.9) and student progress reports (3.0), and relatively low timeliness ratings to student survey reports (3.0) and department-level survey data on the hidden web pages (3.0).
Table 5: Use, Intelligibility and Timeliness of Institutional Research Data and Reports (All Respondents)

| | Used % (N) | Intelligibility: Mean* (N) | Timeliness: Mean* (N) |
|---|---|---|---|
| Student Records: | | | |
| Admissions | 62% (47) | 3.4 (40) | 3.4 (40) |
| Enrollment | 92% (56) | 3.4 (58) | 3.2 (54) |
| Credit Hours | 67% (37) | 3.2 (39) | 3.2 (37) |
| Degrees Conferred | 67% (39) | 3.2 (38) | 3.2 (36) |
| Student Progress | 36% (21) | 3.0 (21) | 3.2 (19) |
| Student Survey Reports | 75% (46) | 3.2 (50) | 3.0 (48) |
| Faculty and Staff Reports | 43% (26) | 3.2 (26) | 3.3 (27) |
| Other Reports: | | | |
| Funding formula | 29% (17) | 2.9 (18) | 3.2 (17) |
| NCSU Organizational chart | 52% (32) | 3.3 (31) | 3.3 (27) |
| "Hidden" Web Pages: | | | |
| Internal Data | 24% (14) | 3.3 (13) | 3.2 (12) |
| Department-level survey data | 37% (22) | 3.2 (20) | 3.0 (20) |
| Planning and Evaluation | 18% (10) | 3.2 (12) | 3.1 (11) |

*Based on a 4-point scale ("4" = "excellent," "3" = "good," "2" = "fair," "1" = "poor")
Respondents were asked to describe in their own words the relevance of the student records, student survey, faculty and staff, and other miscellaneous reports or data to their unit's needs, and their primary reason for using any such reports. A total of 34 respondents commented on the relevance of student record reports and data (admissions, enrollment, credit hours, degrees conferred, and student progress). The majority indicated that the reports are important to their unit (see attached). Unit directors in particular point to the importance of the reports in providing the information they need to perform their duties, such as making budget projections, planning software purchases, and determining needs for fire safety and campus police. College and department administrators tend to use the data for planning and assessment activities. The few respondents indicating they did not use the student records data said either that they did not need it or, in rare cases, that they could not find what they wanted or found it easier to get what they needed from other sources.
Slightly fewer respondents (24) commented on the relevance and usefulness of student survey data and reports. Most commented on their use of the student survey data for program review, assessment, and planning. Several commented that the survey data does not provide the level of detail they need, although the department-level data they are seeking is in fact available on the web. A few respondents also commented that there is too much survey data, making it difficult to find what they need.
Only 11 respondents (representing all general offices except committee chairs) commented on the relevance and usefulness of the faculty and staff reports. Those commenting vary in the extent to which they see the faculty and staff reports as relevant and useful to their particular unit. Among the various uses of the faculty and staff reports are comparisons with peer institutions, monitoring affirmative action programs, and tracking FTE loads.
Comments on the relevance and usefulness of the funding formula and the NC State University organizational chart available on the UPA web site were offered by 15 respondents in a range of offices. Relatively few of them, however, said either was especially relevant or useful to their unit. Several commented that the organizational chart was a useful reference, while a few others used the funding formula for budget projections and research proposals.
Respondents offered a range of comments on the UPA hidden web pages. Several respondents noted that once they understood how to access the hidden web pages, they found the information very useful for things like planning and program assessment.
Respondents were asked to indicate any reports they might find useful to have on the UPA web site that are not currently provided. Some respondents noted that they get the additional information they need through ad hoc requests and do not necessarily see a need to have such specific information on the web. Several, however, did make specific suggestions for additional reports, including:
Planning and Assessment Information (Table 6)
With the exception of accreditation information (14%), at least one-third of all UPA web-site users report using each of the planning and assessment information areas asked about, with compact planning being the most heavily used (59%). In general, the intelligibility ratings for the planning and assessment information are somewhat lower than those given for other data and reports (see above). The highest average intelligibility rating was given to the enrollment planning information (3.3) and the lowest to information on accreditation (2.9). The planning and assessment information is generally rated as at least "somewhat useful," with ratings on the 3-point scale ranging from a high of 2.6 for compact planning information and information on assessing programs to 2.3 for information about accreditation.
Table 6: Use, Intelligibility and Usefulness of Planning and Assessment Information (All Respondents)

| | Used % (N) | Intelligibility: Mean* (N) | Usefulness: Mean** (N) |
|---|---|---|---|
| Strategic Planning | 44% (25) | 3.1 (27) | 2.4 (25) |
| Compact Planning | 59% (34) | 3.1 (36) | 2.6 (31) |
| Enrollment Planning | 43% (25) | 3.3 (25) | 2.4 (21) |
| Peers & Performance Measures | 36% (21) | 3.0 (23) | 2.4 (21) |
| Assessing Programs | 35% (19) | 3.2 (18) | 2.6 (17) |
| Accreditation | 14% (8) | 2.9 (11) | 2.3 (10) |

*Based on a 4-point scale ("4" = "excellent," "3" = "good," "2" = "fair," "1" = "poor")
**Based on a 3-point scale ("3" = "very useful," "2" = "somewhat useful," "1" = "not at all useful")
Respondents from a range of offices commented on how they have used the planning and assessment information available on the UPA web site. Uses mentioned include:

No respondents mentioned any planning and assessment information not currently provided that they would like to see on the UPA web site.
Non-Routine Reports (Tables 7 and 8)
Fewer than half of respondents (46%) indicated they had requested a non-routine report from UPA. The provost's office, college administrators, and unit directors were the most likely to have requested a non-routine report. Overall ratings of UPA's responsiveness to such requests were positive (3.5 on a 4-point scale). However, ratings vary by the office making an ad hoc request, from an average of 4.0 from those in the provost's and chancellor's offices to 3.1 from committee chairs.
Thirty percent of survey respondents said they made an ad hoc request in the past year for UPA data, analyses, and/or interpretation not available on the web (Table 8). Slightly fewer respondents made an ad hoc request for planning (20%) or for assessment and program review data or assistance (20%). Fewer requests were made for survey development (12%) or external reporting (15%). Regardless of the type of ad hoc request, however, those making such requests gave consistently high ratings to the intelligibility, timeliness, and format of the data delivered by UPA.
Table 7: Evaluation of UPA's Responsiveness for "Non-Routine" Reports, by Title

| | All % (N) | Chanc % (N) | Col % (N) | Com Chair % (N) | Dept % (N) | Dir % (N) | Prov % (N) |
|---|---|---|---|---|---|---|---|
| Not used / not rated* | 54% (44) | 67% (6) | 25% (2) | 73% (8) | 64% (42) | 31% (4) | 17% (1) |
| Used | 46% (74) | 33% (3) | 75% (6) | 27% (3) | 36% (24) | 69% (9) | 83% (5) |
| UPA Responsiveness** | 3.5 | 4.0 | 3.8 | 3.1 | 3.3 | 3.4 | 4.0 |

*Includes respondents who either said they had not made a request for a non-routine report or said they did not have an opinion.
**Average rating among those requesting non-routine reports, based on a 4-point scale ("4" = "excellent," "3" = "good," "2" = "fair," "1" = "poor")
Table 8: Use, Intelligibility, Timeliness and Format of Information from "Non-Routine" Requests (All Respondents)

| Special Requests and Customized Analysis | Used % (N) | Intelligibility: Mean* (N) | Timeliness: Mean* (N) | Format: Mean* (N) |
|---|---|---|---|---|
| UPA data, analyses and/or interpretation not available on our Web site | 30% (33) | 3.5 (29) | 3.5 (28) | 3.5 (29) |
| Planning | 20% (22) | 3.5 (18) | 3.6 (18) | 3.5 (18) |
| Assessment and program review | 20% (22) | 3.5 (19) | 3.4 (19) | 3.5 (19) |
| Survey development | 12% (13) | 3.4 (9) | 3.6 (10) | 3.5 (10) |
| External reporting | 15% (17) | 3.5 (16) | 3.5 (16) | 3.5 (16) |

*Average rating among those requesting non-routine reports, based on a 4-point scale ("4" = "excellent," "3" = "good," "2" = "fair," "1" = "poor")
Almost half of respondents offered comments on how UPA can improve its services. While several complimented UPA staff and services, the vast majority, representing all offices included in the survey, indicated that UPA needs to do a better job of letting the campus community know about the data and services it provides. Other specific reports that respondents mentioned for inclusion on the web site included:
Only two respondents offered specific criticisms of UPA's responsiveness to particular requests, both indicating that limited staff time was apparently the reason. Another respondent noted that it is difficult to find UPA from the university home page.
Respondents were asked to comment on any new issues they anticipate coming up for which UPA might be able to provide information and/or assistance. Many of those commenting offered fairly specific suggestions about data needed for assessment purposes, such as for:
Others commented on the need for