With the advent of ChatGPT we are seeing more and more automation when it comes to posts. Sometimes the entire post is created by ChatGPT or a similar large language model; sometimes only a small portion of the message was created in an automated fashion.
The question is, what should we do about it here within the community? Should we flag this material? Should we ban it? Should we require disclosure? We need to have a frank discussion so that site rules can be amended and appropriate changes can be made to the user interface if desirable so that such content can be flagged.
The mods have been talking about this for a while, but now it's time to open it up to community discussion.
My first thought was that by banning it outright, futures.io might deprive itself of some cutting-edge and insightful information. A disclosure policy might be a better option. Since in many instances AI-generated text can be detected, non-disclosure could be caught a lot of the time.
My second thought was to ask ChatGPT itself for advice on this. Yeah, really.
Here it is:
It's indeed a relevant discussion, especially considering the advancements in AI language models like GPT-3 and the expected increase in automated content generation. Here are some potential strategies for handling AI-generated content:
1. Disclosure Requirements: A possible approach is to require users to disclose when a post or a portion of a post has been generated by an AI. This could be similar to how sponsored posts are handled on many platforms. Users could be asked to mark their posts with a tag or disclaimer if AI-generated content is included. This fosters transparency while still allowing users to benefit from the insights provided by AI.
2. AI Use Guidelines: Instead of banning AI-generated content outright, you could create guidelines for appropriate use of AI on the platform. These could address issues like plagiarism, spam, or misleading information. The guidelines could also encourage users to fact-check AI-generated content before posting to ensure accuracy and reliability.
3. AI Content Flags: This could be an extension of the disclosure requirement. Once a post has been identified as AI-generated, the platform could flag it, so other users are aware.
4. AI Review System: Consider developing an AI content review system. This could be an automated process that identifies AI-generated content based on certain characteristics. If a post is flagged, it could be reviewed by moderators before it goes live.
5. Community Feedback: Most importantly, listen to your community's feedback. After explaining what AI-generated content is and what it could mean for the platform, poll your users or open a discussion thread. Their input will be valuable in shaping the community's stance on AI-generated content.
6. User Education: Along with the rules and guidelines, it might be a good idea to educate the community members about the potential drawbacks and advantages of AI-generated content.
Remember, AI can be a powerful tool to assist users in many ways such as generating trading strategies, creating trading-related content, etc., but it is also important to maintain the human connection and authenticity within the community. The strategy should strike a balance between leveraging AI capabilities and maintaining the organic nature of community interactions.
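To make the disclosure and review ideas above concrete, here is a minimal sketch of how a forum backend might combine them. Everything here is hypothetical: the `[ai-assisted]` tag name, the `Post`/`ModerationQueue` classes, and especially the `looks_ai_generated` heuristic (real AI detectors are unreliable, so the phrase list is purely illustrative).

```python
from dataclasses import dataclass, field

AI_DISCLOSURE_TAG = "[ai-assisted]"  # hypothetical disclosure tag (strategy 1)


@dataclass
class Post:
    author: str
    body: str


def looks_ai_generated(text: str) -> bool:
    # Placeholder heuristic only: flags a few telltale boilerplate phrases.
    # A real detector would be far more involved and still error-prone.
    boilerplate = ("as an ai language model", "it's important to note that")
    lowered = text.lower()
    return any(phrase in lowered for phrase in boilerplate)


@dataclass
class ModerationQueue:
    pending: list = field(default_factory=list)

    def submit(self, post: Post) -> str:
        # Strategy 1/3: disclosed AI content is published with a visible flag.
        if AI_DISCLOSURE_TAG in post.body.lower():
            return "published-with-flag"
        # Strategy 4: undisclosed content that trips the heuristic is held
        # for human moderator review before going live.
        if looks_ai_generated(post.body):
            self.pending.append(post)
            return "held-for-review"
        return "published"
```

The point of the sketch is the routing logic, not the detection: disclosed posts go straight through with a reader-visible flag, while only undisclosed suspect posts cost moderator time.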
You can't outsource confidence in trading decisions
I agree with what @bwolf posted. Valid points. Looking at it from a different perspective, AI could be good for bringing people out of their shell. Mike, you know how many people don't post or even say thanks on FIO. Part of that is fear of being called out or looking bad.
If you are not good at formulating a post, then AI could be a way to put a few thoughts down and have it expand them into a comprehensible version of what you're thinking. How many thoughts could AI unlock if people can get past the fear of looking bad?
BTW, as an afterthought: perhaps you could have an anonymous thanks button. If social media has proven anything, it's that anonymity brings people out in droves when what they are responding to cannot be tied back to them.
The fear of looking bad is a real thing that holds people back, so there is something in this.
On the other hand, one of the things that I personally would not want to see is people not writing their own thoughts and their own experiences, but using the AI as a substitute for them.
The genuine experiences of real traders are what is most valuable about the forum. Yes, an AI can probably write better than a lot of people, and probably could make an expanded, comprehensible post out of a rough draft. But then, at least in part, it wouldn't be from the trader; it would be from the program. What came from the original, and what from the machinery? What is real, and what is dressed up, tidied, or expanded on in ways that came from the algorithm and not from the trader?
I'm supportive of our learning how to use these new AIs to our benefit. But I want to be talking to other people, who may have something to tell me, and not to ChatGPT, which is just putting words together and isn't a human being who is trading. I can often learn from a real trader who is struggling with what I am struggling with. At best, a generative AI is giving a consensus of the many millions of samples of text that it has trained on. Getting that consensus can be useful, but it's not the same as talking to a person who is doing it.
Which does not mean there's no role for AI. This is just a way to state one of the issues.
Bob.
When one door closes, another opens.
-- Cervantes, Don Quixote
@Big Mike's OP says "With the advent of ChatGPT we are seeing more and more automation when it comes to posts". I'm not sure what "automation" actually means here. If he means there are now bots making posts using AI/ChatGPT/etc., then yes, I would lean towards stopping that. But if he means people are using AI/ChatGPT/etc. to help them write posts, then I don't see a problem with that. I assume that anybody doing that wouldn't waste the time to create posts they didn't agree with.