
Northern Essex Community College (NECC) established an Artificial Intelligence (AI) Task Force in Fall 2024 to address the growing impact of artificial intelligence on higher education. 

The Task Force includes faculty and staff from various departments across the college. The Task Force conducted two surveys: one for currently enrolled students and one for faculty and staff. The purpose of these surveys is to gather insights, understand perceptions, and inform decision making about how AI is impacting or could impact the institution.

The surveys were open from 3/1/2025 to 3/31/2025. Of the 683 employees (as of March 1st), 95 responded, a 14% response rate. Of the 5,010 currently enrolled students, 300 responded, a 6% response rate. See the Complete Survey Data for major themes and quotes.
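For readers who want to check the figures, the response rates above are simple division; a minimal sketch (the counts are taken from the paragraph above):

```python
def response_rate(responses: int, population: int) -> float:
    """Return the response rate as a percentage."""
    return responses / population * 100

# Counts reported in the survey summary above.
employee_rate = response_rate(95, 683)     # ~13.9%, reported as 14%
student_rate = response_rate(300, 5010)    # ~6.0%, reported as 6%
print(f"Employees: {employee_rate:.0f}%  Students: {student_rate:.0f}%")
```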

Section 1.1 includes findings from the employee survey, specifically the quantitative questions, such as multiple choice and Likert scales.
Section 1.2 includes findings from the employee survey, specifically themes from the open-ended questions.
Section 2.1 includes findings from the student survey, specifically the quantitative questions, such as multiple choice and Likert scales.
Section 2.2 includes findings from the student survey, specifically themes from the open-ended questions.

NECC AI Survey 2025


Section 1.1
1. AI Tool Usage: ChatGPT is the most used tool (41.5%), though a significant portion (21.5%) don’t use AI tools at all.

2. Implementation Needs: Employees identified clear policy guidelines (56.9%), best practices documentation (50%), and training programs (40%) as the most critical needs for effective AI implementation.

3. Training Preferences: Hands-on workshops on both campuses (46.2%) and online self-paced tutorials (40%) were the most requested training formats, showing interest in both in-person and flexible learning options.

4. Policy Priorities: Student use in coursework (50%), ethical guidelines (44.6%), and copyright/intellectual property concerns (31.5%) were identified as requiring immediate policy guidance.

5. Current AI Applications: Writing/editing documents (32.3%), research (20.8%), and email drafting (20.8%) are the most common current AI applications among employees.

Section 1.2
1. Widespread but conflicted AI adoption: There’s a significant range in how employees engage with AI – from enthusiastic adopters using it for content creation, educational resources, and administrative tasks to those expressing strong resistance. Many are interested but uncertain about appropriate applications for their roles.

2. Need for institutional AI policies: Employees consistently call for clear college-wide guidelines on acceptable AI use, particularly regarding academic integrity. Many faculty feel they’re navigating challenges without adequate support or standardized approaches.

3. Student AI usage concerns: Faculty report widespread student use of AI across assignments, creating significant workload increases for detecting AI-generated content and raising concerns about skill development. The lack of detection tools and policies has created academic integrity challenges.

4. Training and education gap: There’s interest in learning more about effective and ethical AI applications, with specific requests for training on prompting techniques, evaluating AI-generated content, and understanding limitations.

5. Quality, reliability and ethical concerns: Many employees express skepticism about AI accuracy, data privacy implications, and ethical questions around content creation and intellectual property. There are also concerns about potential job displacement and loss of human connection in education.

Section 2.1
1. AI Tool Usage: Approximately 36% of students use AI tools very frequently (almost daily), while about 36% never use them. ChatGPT is most widely used (42%), with Grammarly (32%) and Microsoft Copilot (10.7%) trailing.

2. Use Cases: Students primarily use AI for understanding difficult concepts (36.3%), writing support (28.3%), grammar/proofreading (33%), study guides/exam prep (23%), and math problems (17.3%). This suggests students see AI as both a learning and practical assistance tool.

3. Policy & Guidelines: There appears to be confusion about when AI use is permitted: only 33.3% of students report that all their course syllabi clearly state AI policies, while 37% say few or none do. This highlights an opportunity to standardize AI policy communication.

4. Challenges: Students identify getting accurate/reliable results (37%) and knowing when AI use is allowed (26.7%) as their biggest challenges, suggesting a need for both technical guidance and clearer institutional policies. Notably, 29.3% said they had no challenges.

5. Training Preferences: A majority (53.7%) said they are not interested in AI training. Among the formats offered, online self-paced training via Blackboard was most preferred (33.3%), followed by in-person sessions (20%) and online sessions via Zoom (16%), showing a demand for flexible learning options about AI tools.

Section 2.2
1. Divided perspectives on AI use in education: Students show strong polarization in their views. Some oppose AI use in academic settings, while others describe it as a “godsend” for learning support, particularly for understanding complex concepts.

2. Environmental and ethical concerns: Students expressed worry about AI’s environmental impact (energy consumption) and ethical issues (stealing original ideas from artists and writers), showing awareness of broader societal implications beyond the classroom.

3. Academic integrity worries: Students are concerned about peers using AI to avoid genuine learning, noting frustration with “generic AI answers” in discussions and cases where students “use AI, don’t study, and get good grades,” suggesting internal student policing of academic standards.

4. Calls for thoughtful, balanced integration: A significant theme was students advocating for AI as a learning aid rather than replacement – using metaphors like AI being “a passenger while I drive the car” and emphasizing the importance of limiting AI’s scope to maintain authentic learning.

5. Need for educational guidance: Students requested institutional guidance on appropriate AI use, suggesting that helping “students learn practical and appropriate ways to utilize AI could be quite beneficial.”

Future Directions
Based on the survey results, I recommend we establish the following working groups to address key concerns and opportunities. We will continue to work closely with MACH (the Massachusetts Artificial Intelligence Collaborative for Higher Education), a collaborative initiative uniting Massachusetts state universities and community colleges to explore AI’s potential in higher education.

The proposed working groups align closely with the goals established by the AI Task Force:

Policy Development & Academic Integrity Committee: Develop clear, college-wide AI usage policies with specific guidelines for student coursework. We will work collaboratively with the Policy Group to ensure alignment with existing policies.

Faculty & Staff Professional Development: Create differentiated training programs addressing practical applications in teaching, administrative tasks, and ethical considerations. Develop a framework to embed AI-readiness into the curriculum. Format: offer the formats employees preferred in the survey (hands-on workshops on both campuses and an online self-paced option).

Student Literacy & Support: Develop resources that help students use AI tools appropriately and understand their boundaries, focusing on digital literacy, ethical usage guidelines, and balancing AI assistance with authentic learning.

Tool Evaluation & Administrative Implementation: Evaluate AI tools for administrative efficiency; address technical support needs, data privacy and security implications, and cost/access concerns.