Someone got Gab's AI chatbot to show its instructions (mbin.grits.dev)
Posted by mozz to [email protected] • 1 year ago
JackGreenEarth • 7 points • 1 year ago
Yes, but what LLM has a large enough context length for a whole book?
@[email protected] • 8 points • 1 year ago
Gemini Ultra will, in developer mode, have a 1 million token context length, so that would fit at least a medium-sized book. No word on what it will support in production mode, though.
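As a rough sanity check on the "medium book fits in 1M tokens" claim, here is a minimal back-of-envelope sketch; the word count and tokens-per-word ratio are assumptions (a typical medium novel and a common rule of thumb for BPE tokenizers), not figures from the thread:

```python
# Back-of-envelope check: does a medium-sized book fit in a 1M token context window?
# Assumptions: ~90,000 words for a medium novel, ~1.3 tokens per English word
# (actual ratios vary by text and tokenizer).

WORDS_IN_MEDIUM_BOOK = 90_000   # assumed
TOKENS_PER_WORD = 1.3           # assumed rule of thumb

estimated_tokens = int(WORDS_IN_MEDIUM_BOOK * TOKENS_PER_WORD)
context_window = 1_000_000

print(f"Estimated tokens for the book: {estimated_tokens:,}")
print(f"Fits in a 1M token window: {estimated_tokens <= context_window}")
# Roughly 117,000 tokens -- comfortably under 1,000,000, with room left for the prompt and output.
```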
JackGreenEarth • 3 points • 1 year ago
Cool! Are there any other models, even FOSS ones, with a context length longer than 4096 or 8192?
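One way to check what context length an open-weight model is configured for is to read its config from the Hugging Face Hub. A minimal sketch follows; the model IDs are just examples, and the field name can differ between architectures:

```python
# Sketch: inspect the configured context window of open-weight models via their
# Hugging Face configs. Requires `pip install transformers` and network access.
from transformers import AutoConfig

# Example model IDs -- substitute whichever models you want to compare.
model_ids = [
    "mistralai/Mistral-7B-v0.1",
    "togethercomputer/LLaMA-2-7B-32K",
]

for model_id in model_ids:
    config = AutoConfig.from_pretrained(model_id)
    # Most decoder-only configs expose the context window as `max_position_embeddings`;
    # some architectures use other names (e.g. `max_seq_len`), so fall back gracefully.
    ctx = getattr(config, "max_position_embeddings", None) or getattr(config, "max_seq_len", "unknown")
    print(f"{model_id}: context length = {ctx}")
```

Note that the configured value is only the positional limit the weights were trained (or fine-tuned) for; practical quality at the far end of the window varies by model.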