I used chatGPT's research feature the other day. I was thinking about writing about a local secret club that's been in the press a lot, both with facts and conspiracy theories. 11 minutes, 15 pages, and 41 sources later, I got a research paper. chatGPT likes footnotes too!
Have you tried Deep Research in conjunction with a reasoning model? That's like my new favorite research tool (at least for complex things I really need to learn about).
No, tbh this was my first use of the research feature. It suits more of an async read-and-ponder versus the back-and-forth convo I've used for brainstorming up to this point. I haven't used explicit model switching much. I think Daniel made an agent to pick the best model based on use case, and I wonder if we're in a zone of model proliferation where choosing gets abstracted away; it feels like a barrier to adoption.
I think it's temporary and you will just be able to start talking (or typing, or interacting in any way really) with an AI and then all this stuff will happen behind the scenes, just completely under the surface, and in a split second to boot.
This is the worst it's ever gonna get. That's shocking.
This consumer stuff - it's just the tip. If we're talking iceberg, that is.
Yes, completely. Just for a second, though.
Heehee
"I’ve decided to keep it private—just between me, the NSA, and whoever else is on today’s Signal chat."
Don't worry, your secret is safe with us. Uh, I'm afraid I've said too much. Delete chat. DELETE CHAT.
(Also, thanks for the shoutout!)
The great thing about the internet is that once you delete a chat someone else has already seen, they'll actually unsee it in that same instant. HTML is weird.
What happens in a Signal chat, stays in a Signal chat. That's Newton's first law, I believe.
Gabbard's 2nd Law, but who's counting?