pygmalion-2-13b

Maintainer: PygmalionAI

Total Score

72

Last updated 5/27/2024


Model Link: View on HuggingFace
API Spec: View on HuggingFace
Github Link: No Github link provided
Paper Link: No paper link provided


Model overview

The pygmalion-2-13b is an instruction-tuned version of the Llama-2 large language model, developed by PygmalionAI. It is biased towards fiction writing and conversation, and is an evolution of the earlier Pygmalion models. The model was trained on a mixture of regular instruction data alongside roleplay, fictional stories and conversations with synthetically generated instructions. While similar to the Pygmalion-13b and Pygmalion-7b models, the pygmalion-2-13b has been designed with a focus on more natural language interaction and creative writing tasks.

Model inputs and outputs

Inputs

  • The model takes in text prompts, which can include special tokens such as <|system|>, <|user|> and <|model|> to indicate different roles and control the length and style of the generated output.

Outputs

  • The model generates text, primarily focused on creative writing, storytelling, and natural language conversations.
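The role tokens above can be assembled into a single prompt string before tokenization. A minimal sketch in plain Python (no model required; the persona and message are hypothetical examples):

```python
def build_prompt(system: str, user: str) -> str:
    """Assemble a Pygmalion-2 style prompt from the <|system|>, <|user|>,
    and <|model|> role tokens; the model continues after <|model|>."""
    return f"<|system|>{system}<|user|>{user}<|model|>"

# Hypothetical persona and opening message
persona = "Enter RP mode. You are Mira, a cheerful tavern keeper in a fantasy town."
prompt = build_prompt(persona, "Good evening! What's on the menu tonight?")
print(prompt)
```

The resulting string would then be passed to the tokenizer and the model's generate call as with any other Hugging Face causal language model.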

Capabilities

The pygmalion-2-13b model is particularly skilled at generating coherent and engaging fictional narratives, as well as natural-sounding dialogue. It can be used to aid in the creative writing process, generate character backstories and worldbuilding details, and facilitate interactive roleplay scenarios.

What can I use it for?

The pygmalion-2-13b model could be useful for a variety of creative and conversational applications, such as:

  • Assisting writers and storytellers in ideation, character development, and narrative generation
  • Powering interactive fiction and choose-your-own-adventure experiences
  • Enabling more natural and immersive chatbots and virtual assistants
  • Facilitating collaborative worldbuilding and roleplaying games

Things to try

One interesting aspect of the pygmalion-2-13b model is its ability to maintain coherent personas and narratives over longer conversational exchanges. You could try prompting the model with a character description and backstory, then engaging in an extended dialogue to see how it develops the character and storyline. Additionally, experimenting with different prompt structures and special tokens could yield interesting creative results.
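Keeping a persona coherent over a longer exchange usually means replaying the conversation history in each prompt. A sketch of one way to do that, assuming the <|system|>/<|user|>/<|model|> format described above (the persona and messages are hypothetical):

```python
def build_conversation(system: str, turns: list[tuple[str, str]], next_user: str) -> str:
    """Rebuild the full prompt from a system persona, prior (user, model)
    exchanges, and the next user message, ending at <|model|> so the
    model generates the next in-character reply."""
    parts = [f"<|system|>{system}"]
    for user_msg, model_msg in turns:
        parts.append(f"<|user|>{user_msg}<|model|>{model_msg}")
    parts.append(f"<|user|>{next_user}<|model|>")
    return "".join(parts)

# Hypothetical character and history
history = [("Who are you?", "I am Ser Aldric, knight of the realm.")]
prompt = build_conversation(
    "Enter RP mode. You are Ser Aldric, a stoic knight.", history, "What is your quest?"
)
```

Trimming the oldest turns when the prompt approaches the model's context window is the usual refinement of this approach.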



This summary was produced with help from an AI and may contain inaccuracies; check the links to read the original source documents!

Related Models


pygmalion-2-7b

PygmalionAI

Total Score

53

pygmalion-2-7b is an instruction-tuned Llama-2 7B model developed by PygmalionAI that is biased towards fiction writing and conversation. It was fine-tuned on a mixture of regular instruction data alongside roleplay, fictional stories, and conversations with synthetically generated instructions. Similar models include the pygmalion-2-13b, which is a 13B version of the model, and the metharme-7b, which is an earlier experiment in this space.

Model inputs and outputs

The pygmalion-2-7b model takes in text prompts and generates text outputs. The prompts use a specific format with different roles denoted by the <|system|>, <|user|>, and <|model|> tokens. The system prompt can be used to set the model's context, the user token indicates user input, and the model token signals that the model should generate a response.

Inputs

  • Prompts: Text prompts using the role tokens to set context, provide user input, and signal the model to generate a response.

Outputs

  • Generated text: The model generates text in response to the provided prompts, automatically emitting an end-of-text token when the response is complete.

Capabilities

The pygmalion-2-7b model is capable of generating fictional stories, roleplaying scenarios, and engaging conversations. It can be guided with natural language prompts to take on different personas and respond accordingly. The model was trained on a diverse dataset, allowing it to draw from a wide range of knowledge and styles.

What can I use it for?

The pygmalion-2-7b model is well suited to creative writing and storytelling applications. It could assist authors in generating ideas, drafting narratives, or collaborating on fictional worlds. Its conversational abilities also make it a potential tool for interactive fiction, chatbots, or other dialogue-driven experiences. For businesses, the model could be used to generate engaging content for marketing, entertainment, or customer interaction, and it could be fine-tuned further for specific use cases or integrated into larger AI-powered applications.

Things to try

One interesting thing to try with the pygmalion-2-7b model is experimenting with different personas and prompting styles. The model's flexibility allows it to take on a wide range of characters and respond in various tones and voices. You could prompt it to play a whimsical wizard, a brooding detective, or a jolly pirate, and see how it adapts its language and storytelling to fit the character. Another idea is to use the model to collaboratively build out a fictional world or narrative: start with a high-level prompt, then take turns adding details, plot points, and dialogue, letting the model fill in the gaps and build on the shared vision. This interactive approach could lead to unexpected and engaging results.



mythalion-13b

PygmalionAI

Total Score

133

The mythalion-13b model is a merge of the Pygmalion-2 13B and MythoMax L2 13B models, created in collaboration between PygmalionAI and Gryphe. According to the maintainers, this model seems to outperform MythoMax in roleplay and conversation tasks.

Model inputs and outputs

Inputs

  • The model can be prompted using both the Alpaca and Pygmalion/Metharme formatting; the latter uses special tokens like <|system|>, <|user|>, and <|model|> to indicate different roles and conversation flow.

Outputs

  • The model generates long-form text responses that aim to stay in character and continue the narrative, making it suitable for fictional writing and roleplaying.

Capabilities

The mythalion-13b model is focused on generating engaging, character-driven text for creative writing and roleplay scenarios. It was trained on a mixture of instruction data, fictional stories, and conversational data to develop its capabilities in these areas.

What can I use it for?

The mythalion-13b model is well suited to projects involving fictional writing, interactive storytelling, and character-driven roleplaying, such as interactive fiction, creative writing assistants, and open-ended chatbots. However, the maintainers note that the model was not fine-tuned to be safe or harmless, so it may generate content that is socially unacceptable or factually incorrect.

Things to try

One interesting aspect of the mythalion-13b model is its use of the Pygmalion/Metharme prompting format, which lets the user set the character persona and guide the model's responses to stay in character. Experimenting with different character backgrounds and personas could lead to unique and engaging narrative experiences.
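The Alpaca formatting mentioned above wraps each request in instruction/response headers. A minimal sketch of that template (the instruction text is a hypothetical example):

```python
# Standard Alpaca-style prompt template
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

prompt = ALPACA_TEMPLATE.format(
    instruction="Write a short scene set in a haunted lighthouse."
)
print(prompt)
```

The model then generates its reply as the continuation after the "### Response:" header.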



metharme-7b

PygmalionAI

Total Score

54

The metharme-7b is an instruction-tuned LLaMA model developed by PygmalionAI that is biased towards fiction writing and conversation. It is similar to other PygmalionAI models like pygmalion-2-13b, pygmalion-7b, and pygmalion-13b, which are also based on the LLaMA architecture and fine-tuned for conversational and creative tasks.

Model inputs and outputs

The metharme-7b model accepts natural language instructions as inputs, including prompts to engage in fictional writing, roleplaying, and open-ended conversation, and it outputs coherent, context-appropriate text in response.

Inputs

  • Natural language instructions: Prompts and instructions written in natural language, which can include requests for fictional writing, roleplay, or open-ended conversation.

Outputs

  • Textual responses: Relevant, context-appropriate text generated in response to the provided instructions, such as fictional stories, dialogue, or other creative writing.

Capabilities

The metharme-7b model can understand and generate text for a variety of creative and conversational tasks. It can produce fictional stories, engage in roleplay, and hold open-ended conversations, making it well suited to applications in entertainment, gaming, and creative writing.

What can I use it for?

The metharme-7b model can be used for a variety of creative and conversational applications, such as:

  • Fiction writing: Generating original fictional stories, dialogue, and other creative writing based on provided prompts or instructions
  • Roleplaying and interactive fiction: Simulating characters and engaging in roleplaying scenarios, creating immersive and responsive experiences for users
  • Conversational AI: Building chatbots and virtual assistants that can engage in open-ended discussions on a wide range of topics

Things to try

Some interesting things to try with the metharme-7b model include:

  • Experimenting with different prompts and instruction formats to see how the model responds in various creative and conversational scenarios
  • Combining the model with other tools or technologies, such as interactive fiction platforms or virtual roleplaying environments, to create unique and engaging experiences
  • Exploring the model's limitations and biases, and considering ways to mitigate undesirable outputs through careful prompt engineering or content filtering



pygmalion-13b

PygmalionAI

Total Score

109

pygmalion-13b is a conversational language model based on Meta's LLaMA-13B. It has been fine-tuned using a subset of the data from the Pygmalion-6B-v8-pt4 project. This model is version 1 and is part of the Pygmalion line of dialogue models from PygmalionAI. The pygmalion-7b model is a similar conversational model based on LLaMA-7B, also fine-tuned using Pygmalion-6B-v8-pt4 data. Both models follow the usual Pygmalion persona and chat format.

Model inputs and outputs

Inputs

  • Text prompt: The model expects input text following a specific format, including the character's persona and dialogue history.

Outputs

  • Generated text: The model outputs generated text responses that continue the dialogue.

Capabilities

The pygmalion-13b model is designed for open-ended conversational tasks. It can engage in back-and-forth dialogues, take on different personas, and generate relevant and coherent responses. The model can discuss a wide range of topics, drawing on its broad knowledge base.

What can I use it for?

The pygmalion-13b model can be used for various interactive applications, such as chatbots, virtual assistants, or creative writing tools. Its conversational abilities make it well suited to applications where natural language interaction matters, like customer service, education, or interactive entertainment.

Things to try

One interesting aspect of the pygmalion-13b model is its ability to adapt to different personas and styles of interaction. You could experiment with varied persona descriptions to see how the model responds and develops the character over the course of a conversation. Trying different prompting strategies, such as including more or less dialogue history, can also yield interesting results and insights about the model's capabilities.
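The persona-and-history prompt format can be sketched as a small helper. This follows the classic Pygmalion layout (a "Character's Persona:" block, a <START> separator, then alternating chat lines); the character details here are hypothetical:

```python
def pygmalion_prompt(character: str, persona: str, history: list[str], user_msg: str) -> str:
    """Build a classic Pygmalion-format prompt: persona block, <START>
    separator, prior chat lines, then the user's message, ending with
    the character's name so the model continues in character."""
    lines = [f"{character}'s Persona: {persona}", "<START>"]
    lines.extend(history)
    lines.append(f"You: {user_msg}")
    lines.append(f"{character}:")
    return "\n".join(lines)

# Hypothetical character and chat history
history = ["You: Hello there.", "Elara: Greetings, traveler."]
prompt = pygmalion_prompt(
    "Elara", "A wandering herbalist with a dry sense of humor.", history, "Do you sell potions?"
)
```

The model's reply is then generated as the continuation after the trailing "Elara:" line.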
