How I rely on my university-taught writing skills now that I use ChatGPT as my daily assistant at work: the perspective of a recent graduate

How does a recent graduate transition from using AI tools with caution at university to embracing them at work?

In this article, UCM graduate Helen Frielingsdorf shares her experience of adapting to AI, particularly ChatGPT, in her professional life. She describes how the writing skills she developed at university still play a key role in her work and reflects on the impact of education in preparing students for their careers.

Writing with AI

Here I am, sitting in an open-concept office in chinos and a button-up, watching the words appear on my screen as the company’s AI assistant chatbot helps me formulate a newsletter article. I paste in some of my input and meticulously check every word of what comes back. 

I evaluate the context of the input I gave the bot and how well it is represented by the statements in the marked text. Sometimes I am impressed by how nicely it has formulated the sentences for me: the terms it uses that I would not have thought of myself, the simplicity and directness of the phrasing. 

In other cases, I am surprised by how much the bot must have misunderstood what I was trying to say, producing complete gibberish that does not match the content I gave it to work with. It must be really stupid. Then again, it is not really a person, so that judgement might be unfair. Still, I find myself feeling as if I have just won some kind of competition. 

From university restrictions to workplace integration

I recently graduated from University College Maastricht and moved back to Cologne, Germany, to start an internship in sustainability and innovation management at Covestro. When I first started, I was surprised by how differently my co-workers approach the use of LLMs.  

As a student, I felt misunderstood and guilty about using the technology. At UCM, the use of ChatGPT and other LLMs (Large Language Models) for assignments was strictly forbidden. Among students, rumours went around about another one of us “getting caught” or “taken in” for using the forbidden technology in an assignment. 

I was not relying on ChatGPT to complete my assignments but rather used it as an assistant, for instance to find better ways to formulate my writing or to quiz myself before exams and make sure I had understood everything properly. I found the technology helpful in these cases, making studying and learning easier. I valued my education and was intentionally using the tool in ways that did not hinder my learning. 

On the other hand, the message that every use of LLMs such as ChatGPT counted as cheating, based on the assumption that students would use them purely to get out of doing work, made me feel guilty. At the same time, I felt that the ban presumed misuse and took away the opportunities the technology offered.  

Applying university-taught writing skills

After university, the situation changed. With no prohibition in place, ChatGPT became a constant assistant in writing my applications and my CV, and now in my daily routine at work. Here, I am encouraged to use it for efficiency and to independently explore ways of including it in my routines. I notice the feeling of guilt disappearing bit by bit. 

One thing, however, remains: I use the writing skills I learned at university every day. They accompany my collaboration with the LLM. I check every word, revise the paragraphs ChatGPT generates for me even more thoroughly than my own, and put increasingly more time into planning the longer pieces I write. 

I remember how I was taught about the process of writing in the various skills courses I took at UCM: make a draft, don’t be scared to start – just write down your thoughts. Then, sort them out and revise, revise, revise. Make sure that your writing gets your point across. I also remember my thesis. I chose the topic because I was confused, and curious, about the role of AI – what we should learn, what skills we would need now that there is AI that can write for us, and what skills LLMs have already perfected beyond our own ability. 


The difference between knowing language and using it

I remember learning and writing about the terms formal and functional linguistic competency. Formal linguistic competency was described as the ability to “use and effectively apply the rules that govern language, the ability to formulate sentences.” Functional linguistic competency, on the other hand, referred to “the ability to use language to achieve things in the real world.” A 2023 study by Kyle Mahowald showed that LLMs have formal linguistic abilities. Functional ones? Some, but not to the extent that we have them. 

While Mahowald used a very technical definition of the term, linked to specific skills, what strikes me is the aspect of purposeful language use that this definition addresses. It treats language as a tool, differentiating between understanding the basic user manual and being able to recognise the situations in which the tool fits and apply it there. While writing and communicating knowledge require us to understand language and know how to formulate sentences, we usually do not start a conversation for the purpose of using language. We have an idea, we understand the context it applies to, and we feel the need to communicate it. That is why we need language. 

But as with any tool, we first need to know what we want to accomplish by using it – in this case, we need to consider the idea in our mind and how best to articulate it so that another person can understand it, or even be convinced of our opinion. While formal linguistic ability means that we know how to formulate proper sentences, functional linguistic ability gives us the agency needed to communicate our ideas and navigate our problems with language. 

Teaching students to collaborate effectively with AI

Students must develop the ability to evaluate and refine AI-generated language for their intended purposes, a skill AI cannot replace. Exactly this may be the essence of what is often referred to as ‘digital literacy’: the kind of reasoning and articulation of thought that precedes, and reflects on, the use of LLMs and AI. 

To do this, students need to learn to formulate their thoughts and to evaluate both their own writing and AI-generated content critically. The longer I was at university, the more meticulously I found myself checking my writing, paying attention to every part of a written piece and how it supports the main message. I learned the value and importance of well-thought-through writing for achieving real-world objectives. 

A key lesson came from the feedback that I should not merely tick the boxes of what belongs in a paragraph (topic sentence, evidence, etc.) but rather keep a goal in mind and focus on making a clear, direct argument. I learned how important it is to keep the overall argument or idea in mind, and that every sentence needs to contribute to it in a strategic way. The final piece of writing should represent my idea in a holistic and convincing manner. I am still working on improving these skills every day, and I am grateful for the guidance I received at university.  


AI and education

Especially in times of ChatGPT and other LLMs, educators should focus on teaching critical examination and revision of written pieces, including fact-checking, purposeful language use, fostering agency, and integrating AI responsibly into learning. This way, students will not only gain valuable academic writing skills but also learn to collaborate efficiently and responsibly with a new technology.  

By Helen Frielingsdorf, UCM graduate 2024.
