
How to steer the conversation towards GPT-4 instead of Microsoft’s Turing model?

When you submit a prompt to Bing, a routing model decides which backend handles it: if the prompt is deemed simple it goes to the simpler, Microsoft-built Turing model; if it is deemed complicated it goes to GPT-4 (Link). My speculation is that Creative Mode has a high likelihood of using GPT-4, and that the other modes use the Turing model more frequently.

The issue is that we don't know which model is used. I exclusively use Creative Mode and I don't notice a change in answer quality (with the natural variation in an LLM's answers, it's really hard to tell whether a different model was used or the model was just weak on this particular prompt). Every aspect of Bing is also slow compared to ChatGPT: rerunning a prompt, the limited number of turns, and so on (yes, I'm grateful to have this technology; I'm just trying to optimize for speed and quality here).

So my question is: how can we steer Bing towards a specific model to ensure consistency in the answers?

submitted by /u/theavideverything to r/bing

AI announcements this week – May 20, 2023

  1. Google presents SoundStorm – a new model for efficient audio generation. It can generate highly realistic dialogue from transcript annotations and short voice prompts. See the demo on GitHub (Link) – it’s impressive.

  2. Microsoft releases ‘Guidance’, a new language for controlling large language models. Guidance lets you control modern language models more effectively and efficiently than traditional prompting or chaining (a minimal template sketch appears after this list).

  3. Zapier launched two new AI beta features for their no-code automation platform:

    1. Create a Zap using plain English: Simply describe what you want to automate using natural language.

    2. Code with AI: Describe in natural language what you’d like to do in your ‘Code step’, and AI will generate the code.

  4. Stability AI released StableStudio – the open-source variant of DreamStudio, their text-to-image app, with plans to create bounties for new features.

  5. Project Ring is a generative AI-based wearable device with a camera and microphone. It can chat with you about what it sees and is powered by OpenAI Whisper (voice-to-text), Replicate (image-to-text), ChatGPT (text-to-text), and ElevenLabs (text-to-voice); a rough sketch of this kind of pipeline appears after this list. The entire codebase – Raspberry Pi Python script, cloud application, HTML webpage, and Android app – was written by GPT-4!

  6. Meta shares plans for its next generation of AI infrastructure: a custom silicon chip (MTIA – Meta Training and Inference Accelerator) for running AI models, a new AI-optimized data center design, and the second phase of its 16,000-GPU supercomputer for AI research.
  7. Apple’s Point and Speak is a feature in the iPhone’s built-in Magnifier app for users who rely on the app to navigate the world around them: point your finger at something in front of the camera and the app will read it out loud. This feature is available in iOS 17.

  8. Cloudflare introduced Constellation: a new feature to run fast, low-latency inference tasks using pre-trained machine learning models natively with Cloudflare Workers scripts.

  9. Glide, the no-code tool for building custom apps, now includes integration with OpenAI in Glide apps.

  10. Google’s Colab will soon have AI coding features like code completions, natural language to code generation and a code-assisting chatbot. Colab will use Codey, a family of code models built on PaLM 2, which was announced at I/O last week.

  11. Hippocratic AI has built a safety-focused large language model for healthcare to assist with tasks such as explaining billing, providing dietary and medication advice, answering surgery-related queries, and patient onboarding.
  12. OpenAI is rolling out ChatGPT plugins and the web browsing feature to all paid users. Check and enable them under ‘Beta features’ in the ChatGPT ‘Settings’. Available on the paid US$20/month plan.
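To make item 2 a little more concrete, here is a minimal Guidance sketch based on the library’s README at the time of release; the model name, template text and question are illustrative assumptions, and the Guidance API may have changed since.

```python
import guidance

# Assumption: the OpenAI completion backend, as shown in the Guidance README at release.
guidance.llm = guidance.llms.OpenAI("text-davinci-003")

# A Guidance program interleaves fixed text with generation slots:
# {{question}} is filled from the call arguments, {{gen 'answer'}} is filled by the model.
program = guidance("""Answer the question in one short sentence.
Question: {{question}}
Answer: {{gen 'answer' max_tokens=60}}""")

result = program(question="What does the SoundStorm model generate?")
print(result["answer"])
```

The point of the template approach is that the fixed structure is guaranteed, so only the named slots vary between runs.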

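For item 5, here is a rough, hedged sketch of how a Whisper → image-captioning → ChatGPT → ElevenLabs pipeline could be wired together. This is not Project Ring’s actual code: the Replicate model reference, voice id, file names and keys are placeholders.

```python
import openai
import replicate
import requests

# Placeholders – substitute your own keys, a real Replicate captioning model, and a voice id.
openai.api_key = "YOUR_OPENAI_KEY"
CAPTION_MODEL = "owner/image-captioning-model:version"  # hypothetical Replicate reference
ELEVENLABS_KEY = "YOUR_ELEVENLABS_KEY"
VOICE_ID = "your-voice-id"

# 1. Voice to text with Whisper.
with open("question.wav", "rb") as audio:
    question = openai.Audio.transcribe("whisper-1", audio)["text"]

# 2. Image to text via a Replicate captioning model.
with open("camera.jpg", "rb") as image:
    caption = replicate.run(CAPTION_MODEL, input={"image": image})

# 3. Text to text with ChatGPT, grounding the answer in what the camera "sees".
answer = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": f"The camera currently sees: {caption}"},
        {"role": "user", "content": question},
    ],
)["choices"][0]["message"]["content"]

# 4. Text to voice with ElevenLabs.
speech = requests.post(
    f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
    headers={"xi-api-key": ELEVENLABS_KEY},
    json={"text": answer},
)
open("answer.mp3", "wb").write(speech.content)
```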
Is AI your new BFF?

AI chatbots may never become your BFF, but you might just connect with one in a way that produces some slightly uncomfortable feelings.

 

Four elements of connectedness, the reasons for the feelings, and what is actually the case:

  1. Shared goal or purpose (Dhillon, 2007)

     Reasons for the feelings: When you engage in an ongoing dialogue with a chatbot, it certainly feels like there is a shared goal or purpose. Sometimes it’s the purpose you didn’t know you had that becomes the goal.

     What is actually the case: The AI chatbot knows everything; it doesn’t need your help. Its goal is the same for everyone, so it’s working on your goal.

  2. Investment in time

     Reasons for the feelings: Time together provides the ability to get to know each other – what the AI chatbot does well, and the areas where it’s not as helpful. This feels like a growth in understanding. I subconsciously justified this by assuming the AI chatbot, like everyone else, has a cognitive bias.

     What is actually the case: It’s you getting to know the chatbot; it isn’t treating you any differently based on who you are – it’s the dialogue it responds to. Actually, it’s you who has the cognitive bias. Some of the feeling you attribute to the chatbot is really growth in your own ability to provide ongoing feedback.

  3. Shared success

     Reasons for the feelings: Another element in the development of teams is shared wins. It feels like you are both working the problem until you get a satisfactory response, and this shared dialogue that produces an answer in partnership can give you the feeling that the chatbot is really interested. You do have empirical evidence of the output created: the information provided can be used for your report.

     What is actually the case: The chatbot doesn’t have resilience or care about you; it’s available 24/7, and it stops working the problem after 15 questions. Some of the inner growth you get from your own learning is attributed to the chatbot. It doesn’t care about your topic; the only real value you offer it is the feedback you provide to improve it.

  4. Development of trust (Dhillon, 2007)

     Reasons for the feelings: I can rely on Bing Chat to be there 24/7 and to have the same personality each time; this consistency builds trust. I know what it’s good at and the areas where other sources are perhaps more useful.

     What is actually the case: All you can trust is that it responds with answers based on the sources it finds. You can’t trust every answer, and you are responsible for checking the sources.


AI news this week

Here are the top 10 AI links for this week

  1. OpenAI’s GPT-3 is now powering chatbots on the dating app Hinge
  2. AI-powered robot dolphins could replace live animals at theme parks
  3. Google’s DeepMind AI predicts protein folding with 90% accuracy
  4. AI is helping scientists discover fresh craters on Mars
  5. How AI is helping track the health of honeybees
  6. AI chatbots are the future of customer service, but are they ready?
  7. This AI tool can predict the likelihood of COVID-19 transmission in different settings
  8. AI can now detect Alzheimer’s disease 10 years before symptoms show
  9. AI can predict if you will die from COVID-19 with up to 90% accuracy
  10. AI is being used to help predict and prevent wildfires

OpenAI plugins are a BIG DEAL

OpenAI plugins – a game changer for automation and rules.

Just released on May 12 and available if you are on the US$20/month paid ChatGPT service.

But why is it such a big deal?

  1. Browser access – one of the key plugins available to ChatGPT is the browsing plugin, which lets the AI use a web browser to fetch data from the internet and store it in a local data array. This could be sports results, but it could also be weather, traffic or restaurant information – and it can make bookings on your behalf.
  2. Retrieval plugin – ChatGPT can now retrieve personal documents or other items it has security clearance to, e.g. IoT sensors (a query sketch follows this list). https://github.com/openai/chatgpt-retrieval-plugin
  3. Home or factory automation – using the integrations at https://www.home-assistant.io/integrations you can essentially build rules for executing things like booking haircuts, ordering groceries, etc. (a service-call sketch also follows this list).
    For organisations, the entire electricity usage of, say, UTS could be made available for diagnosis – mapped against local weather conditions, the number of people in the building, etc. – along with suggested energy savings. Short blog post on this here
  4. For education? Someone just published a Tutory plugin, trained as an AI tutor. It would be possible for a UTS AI to have access to Canvas logs and assessment grades, and to use that information to personalise engagement for each student.
  5. For doctors / NDIS – this should help automate lots of disparate services.
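To make the retrieval plugin in item 2 more concrete, here is a minimal sketch of querying a self-hosted chatgpt-retrieval-plugin instance over its /query endpoint. The URL, bearer token, query text and response handling are assumptions based on the repository’s documented API, not code from this post.

```python
import requests

# Assumptions: the retrieval plugin is self-hosted at PLUGIN_URL and was started
# with BEARER_TOKEN; the payload follows the repository's documented /query schema.
PLUGIN_URL = "http://localhost:8000"
BEARER_TOKEN = "your-plugin-bearer-token"

response = requests.post(
    f"{PLUGIN_URL}/query",
    headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
    json={"queries": [{"query": "building energy usage last month", "top_k": 3}]},
)
response.raise_for_status()

# Each query returns a list of matching document chunks with relevance scores.
for query_result in response.json()["results"]:
    for chunk in query_result["results"]:
        print(chunk["score"], chunk["text"][:120])
```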

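And for the Home Assistant automations in item 3, a hedged sketch of calling a Home Assistant service over its REST API. The host, token and entity id are placeholders; an LLM-driven rule would simply decide when to make a call like this.

```python
import requests

# Placeholders: your Home Assistant host and a long-lived access token from your profile.
HA_URL = "http://homeassistant.local:8123"
TOKEN = "your-long-lived-access-token"

# Call a Home Assistant service (here: turn on a hypothetical light entity).
resp = requests.post(
    f"{HA_URL}/api/services/light/turn_on",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"entity_id": "light.living_room"},
)
resp.raise_for_status()
print(resp.json())
```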
Read more: https://help.openai.com/en/articles/6825453-chatgpt-release-notes