What ChatGPT suggests for career decision-making

ChatGPT and other AI chatbots are currently all the rage, with possibilities such as generating creative mashups of ideas in a matter of moments, and problems such as generating entire papers that evade plagiarism detection, or scraping artwork and creating derivatives without the artist’s permission or compensation.

Can ChatGPT be helpful for would-be career decision-makers? What happens if I ask ChatGPT for help? No need to wonder – I did, and here’s the response.

What do you think? Not bad advice. It’s missing the “how-tos” – like how to do a self-assessment – and the “where-tos” – like where to research information. There are other aspects that career researchers and practitioners will recognize as missing right away, such as reflecting on your career path to this point, considering emotions, weighing the barriers and supports you have, and recognizing that the process of making a decision is more than just making a list, checking it twice, and trying it out. But, still, the advice seems pretty solid. So, I decided to drill down on self-assessment:

Hmm – this response is VERY interesting. Why these specific assessments? How did these rise to the top of the algorithm? The MBTI is listed as #1? I’ve seen the research, and it’s not pretty when you look at the MBTI and career. So, I asked a follow-up question:

So the MBTI and the DISC have high levels of reliability and validity? I’ve never heard that, and have often heard the opposite. So, I queried a bit more, this time asking for references.

Hmm. Note that I asked for research supporting these tools – I did not ask for “any research” or research that negated them – and what I was given for the MBTI was two articles against it. Not ONE article in support – and yet, in the earlier response, the bot stated that both had HIGH reliability and validity. What? That seems a bit contradictory, doesn’t it? The second citation for the DISC is just confusing – it’s a book from 1986 with no clear link to the DISC. If the DISC has been around for decades, why are the first two citations given for it decades old and not particularly relevant? And do you see the takeaway for both? That they should be used with caution? Yet these were among the top five recommended?

What should be the takeaway? Once again, it seems like the AI chatbot COULD be useful for people wanting to make career decisions, but at the same time, it has CLEAR LIMITATIONS and PROBLEMS. Most individuals will not ask the AI for the psychometric properties of an inventory, and if they did, they probably wouldn’t ask for references – and if they did that, they might see references posted and assume those references actually supported the bot’s statements. In other AI searches I have conducted, I have found the claim-to-citation matching to be way off, and in some cases clearly wrong. More importantly, the AI’s advice is straightforward and simple, and while many career decision-makers can likely follow those steps and come to an informed career decision, the process of career decision-making is often more complex than this – if it were that easy, our clients wouldn’t seek help. Research (e.g., Hayden & Osborn, 2020; Walker & Peterson, 2012) has repeatedly shown how other issues such as mental health, negative career thoughts, and external experiences such as discrimination, socio-economic status, and the like all impact our ability to make career decisions – and yet the bot did not acknowledge these. While it did suggest talking with a career practitioner (Yay!), it didn’t share how to find one (Boo!).

As a career practitioner, it’s important to be aware that just as patients may Google their symptoms and self-diagnose before seeing a doctor, our clients may do some searching online about how to make a career decision and come in bearing the outputs of the AI’s advice. I see this as similar to when someone takes a “color” test or a Facebook survey that tells them the career they should pursue and brings it in. You don’t insult the client by dismissing the results altogether; instead, use it as a launchpad for discussion. What did the “results” say that resonated with you? Would you like to take a different inventory that has been developed and researched for career interests (or skills, or values, etc.), or try a different approach (like card sorts, or writing your career story) to see what that might reveal?

It’s also part of our responsibility to help clients evaluate information about self and options (Sampson et al., 2018) – not in a preachy kind of way, but by demonstrating how some of the information can be accurate and some of it may not be, and how to sift through it. This could be done by modeling: searching online in the moment with the client, with the practitioner speaking out loud as they evaluate the links that come up (“oh, this looks like a sponsored site,” or “this is by the author, so that information is likely biased,” or “this was published in 1986 – I wonder if there’s anything newer”).

All in all, ChatGPT and other AI chatbots are ultimately tools that some clients will use to help them in their career decision-making process. Some of the information appears appropriate and may be helpful, and we may never see those clients in our office. For others, that information will prove overwhelming as they try to follow the straightforward steps, and that may lead them to knock on our door. In that case, the online conversation was the impetus to get help – a net positive.

But what about those who are overwhelmed and see this as yet another failure, or as additional proof that they can’t make good decisions – and as a result believe that seeking help will likely end in the same outcome? This is troubling. Perhaps our professional associations can provide guidance to AI developers to include messages that encourage information seekers to seek support from professionals, or at least let them know that such support exists. Maybe, as practitioners, we need to be more proactive in asking our clients about their online exploration. Ignoring it doesn’t mean it’s not happening. Bring it into the discussion. Don’t demonize ChatGPT, but don’t canonize it either. It’s one resource among many – and it’s how we use it that will affect the outcome.

Hayden, S. C., & Osborn, D. S. (2020). Impact of worry on career thoughts, career decision state, and cognitive information processing-identified skills. Journal of Employment Counseling, 57(4), 163-177. https://doi.org/10.1002/joec.12152  

Sampson, J. P., Jr., Osborn, D., Kettunen, J., Hou, P.-C., Miller, A. K., & Makela, J. P. (2018). The validity of socially-constructed career information. Career Development Quarterly, 66(2), 121-134. https://doi.org/10.1002/cdq.12127

Walker, J. V., III, & Peterson, G. W. (2012). Career thoughts, indecision, and depression: Implications for mental health assessment in career counseling. Journal of Career Assessment, 20, 497-506. https://doi.org/10.1177/1069072712450010

Tech Twins Talk Tech With Peak Careers

The Tech Twins were honored to have a discussion with Jim Peacock of Peak Careers. We talked about “must-have” technology tools, as well as the go-to resources we use, and even highlighted some tips for preventing, minimizing, and combating tech stress. Watch the video:

For more info on the tools we mention, the links are here. We loved talking with each other and with Jim, and the video gives a sneak preview of some of what we’ll be sharing at NCDA.

Create a Branding Statement: 10 Questions and a Formula.

Branding relates directly to story – and personal story is something career practitioners are all about. The ability to succinctly articulate what one does and for whom is at the heart of not only branding, but vocational identity. We can help our clients clarify what solutions they offer to a potential employer through developing a branding statement. The questions below can help guide that process:

  1. What problem(s) can you solve and for whom?
  2. Why do you want to solve that problem?
  3. What message would you like to share?
  4. How would you describe your personality?
  5. How do people feel after working with you?
  6. How are you unique?
  7. Why do people trust you?
  8. What’s your story?
  9. What five words describe you? What five words would others use to describe you?
  10. What are brands you admire? Why?

Certainly, there are additional questions out there that can stimulate thinking about branding, but these 10 should get the creative juices flowing. At the core, attempt to answer this question: What problem(s) can I solve, and for whom? To help with this, try to complete the following formula:

FORMULA: I help ______________ do ____________________.

Some examples:

“I help high school students translate their dreams into reality.”

“Training the technologically timid.”

Challenge: Try your hand at writing a branding statement. If you get stuck, look over your resume, your calendar, and your commitments, and consider what they suggest about your brand. Questions: Do you like what it suggests? Is a change warranted? How does having a branding statement impact how you see yourself and the opportunities around you?

Neat Site: PsyberGuide

I had a student present yesterday on the impact of technology on psychological and mental health service delivery. She shared a website called “PsyberGuide,” a really nifty site that critically evaluates apps based on credibility, transparency, and user experience. Here’s a brief video from Dr. Stephen Schueller, PsyberGuide Executive Director, describing the rationale for and goals of PsyberGuide.


Here’s a screenshot and link to their App Guide:

[Screenshot: the PsyberGuide App Guide]

It’s excellent to have a source dedicated to reviewing apps for quality. However, they don’t currently review career apps like the ones in our tool library, and they don’t include other technologies such as websites, blogs, podcasts, and YouTube videos. That being said, if you notice an app of interest on our page, travel over to theirs to see what the review says.

Stay well!

Trying a New Tech Tool – Google Jamboard

I (Deb) teach a technology and counseling course in the summers, and each summer I try not only to cover longstanding technologies (telephone counseling, email advising/counseling, video chats, Dropbox/Google Drive), but also to push the envelope by exploring other tools, such as apps and collaborative tools. This past week, I experimented with one of the tools in Google Drive, the “Jamboard.”

[Screenshot: a Google Jamboard]

This class meeting was face-to-face, but I have students use technology regardless. The focus was on how to ethically integrate technology into face-to-face counseling, including what needs to occur prior to that decision, during (when with the client), and after the technology is introduced. The students were divided into three groups of about eight each and asked to use the sticky notes (but not talk) to brainstorm options for their group. Here’s an example from the “before” group:

[Screenshot: the “before” group’s Jamboard]

Following this, they were told to organize the stickies into similar themes. You can see the “during” group’s attempt at this; they started changing the sticky colors to match the themes.

[Screenshot: the “during” group’s Jamboard]

Finally, they were asked to collapse similar ideas and then prioritize them into steps. This is the “after” group’s attempt to do this:

[Screenshot: the “after” group’s Jamboard]

Following this, we discussed each stage; I added to each, and allowed the other groups to add to each group’s ideas. Then we processed the use of the tool and how it might be used with a client or with colleagues. We decided that the tool was useful for the first part of brainstorming, where everyone throws ideas up, because it gave everybody a chance to contribute. It became more difficult in the next steps, where decisions had to be made about who would do the classifying and who would prioritize the steps. Clearly, eight people couldn’t do the prioritization at once, and there was no easy way to foster that decision. Someone had to step up as the leader, even if only to delegate tasks (“you three prioritize the green stickies, you three the blue…”).

The class thought that this could be a useful resource with a client in a number of ways. If the client was struggling with anxiety or depression, this board could provide a number of creative strategies or reminders (e.g., cognitive reframes) to help them in the moment. By the counselor also adding in a few (hopefully evidence-based) ideas, this could also strengthen the working alliance. The board could also be used to house goals, steps, links to videos or resources, encouragements, and so forth.

As an instructor, I thought it was a useful tool. I hadn’t thought through the mechanics involved in the steps of organizing and prioritizing. I figured the students could work that out – but it proved to be a situation where one person in each group simply took over. If I were to do it over again, I’d probably provide some suggestions on how to go about those steps. My goal in not doing so was to give them the freedom to explore and create without my being overly prescriptive – but the desired result didn’t occur. Next time, I might add a sticky that outlines the next steps – specific enough that each person has a task – and have each person post a sticky with their name and task number, from which point they would proceed. All in all, it was a fun experiment. It achieved the goals of building the students’ experience with a new technological tool and helping them think through the steps of integrating technology. I’ll probably keep this activity, with some minor modifications, for next year.