Repository: jupediaz/chatgpt-prompt-splitter Branch: main Commit: 017961be5b16 Files: 18 Total size: 24.4 KB Directory structure: gitextract_891tnwvz/ ├── .editorconfig ├── .gitattributes ├── .gitignore ├── .vscode/ │ └── settings.json ├── CONTRIBUTING.md ├── LICENSE ├── Procfile ├── README.md ├── api/ │ ├── index.py │ └── templates/ │ └── index.html ├── requirements.txt ├── static/ │ ├── browserconfig.xml │ ├── chatgpt_prompt_splitter.psd │ ├── scripts.js │ ├── site.webmanifest │ └── styles.css ├── tests/ │ └── test_chatgpt_prompt_splitter.py └── vercel.json ================================================ FILE CONTENTS ================================================ ================================================ FILE: .editorconfig ================================================ { "arrowParens": "always", "bracketSpacing": true, "endOfLine": "lf", "htmlWhitespaceSensitivity": "css", "insertPragma": false, "singleAttributePerLine": false, "bracketSameLine": true, "jsxBracketSameLine": false, "jsxSingleQuote": false, "printWidth": 80, "proseWrap": "preserve", "quoteProps": "as-needed", "requirePragma": false, "semi": true, "singleQuote": false, "tabWidth": 4, "trailingComma": "es5", "useTabs": false, "vueIndentScriptAndStyle": false, "parser": "html" } ================================================ FILE: .gitattributes ================================================ *.html linguist-detectable=false *.css linguist-detectable=false ================================================ FILE: .gitignore ================================================ .vercel *.log *.pyc __pycache__ # Environments .env .venv env/ venv/ ENV/ env.bak/ venv.bak/ ================================================ FILE: .vscode/settings.json ================================================ { "cSpell.words": [ "dotenv", "upstash", "vercel" ] } ================================================ FILE: CONTRIBUTING.md ================================================ # Contributing I'm thrilled 
you are interested in contributing to this ChatGPT Prompt Splitter. Your help is essential to the project's success, and I want you in! There are many ways to contribute to this project, and I appreciate all of them. ## How to contribute ### Reporting bugs This section guides you through submitting a bug report for this project. Following these guidelines helps maintainers and the community understand your report, reproduce the behavior, and find related reports. ### Suggesting enhancements This section guides you through submitting an enhancement suggestion for this project, including completely new features and minor improvements to existing functionality. Following these guidelines helps maintainers and the community understand your suggestion and find related suggestions. ### Your first code contribution Unsure where to begin contributing to this project? You can start by looking through the "good first issue" and "help wanted" issues. Thank you! ================================================ FILE: LICENSE ================================================ MIT License Copyright (c) 2023 Jose Diaz Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ================================================ FILE: Procfile ================================================ web: python api/index.py ================================================ FILE: README.md ================================================

ChatGPT PROMPTs Splitter


[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fjupediaz%2Fchatgpt-prompt-splitter) ### ❓ Have you ever received a message from ChatGPT about sending too much data and needing to send a shorter text? #### **Here's a great alternative to bypass this limitation!** 🚀 ![Error Message Too Long](/static/screenshots/screenshot_error_message_too_long.png) ## Overview **ChatGPT PROMPTs Splitter** is an open-source tool designed to help you split long text prompts into smaller chunks, making them suitable for use with ChatGPT (or other language models with character limitations). The tool divides the text into safe chunks of up to 15,000 characters per request by default, although this can be changed. The project includes an easy-to-use web interface for inputting the long text, selecting the maximum length of each chunk, and copying the chunks individually to paste them into ChatGPT. ## Post on Medium You can read the full article on Medium: [ChatGPT PROMPTs Splitter: Split long text prompts into smaller chunks for ChatGPT](https://medium.com/@josediazmoreno/break-the-limits-send-large-text-blocks-to-chatgpt-with-ease-6824b86d3270) ## How it works The tool uses a simple algorithm to split the text into smaller chunks. The algorithm is based on the following rules: 1. Divide the prompt into chunks based on the specified maximum length. 2. Add instructions to the chunks telling the AI how to receive and acknowledge each part, and to wait for the completion of chunk transmission before processing the request.
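The two rules above can be sketched in Python. This is a simplified illustration of the idea (the full implementation lives in `api/index.py`), not the exact production code:

```python
import math

def split_prompt(text: str, max_length: int) -> list:
    """Split text into parts of at most max_length characters, wrapping
    each part with instructions that tell the model to wait for the rest."""
    if max_length <= 0:
        raise ValueError("Max length must be greater than 0.")
    num_parts = math.ceil(len(text) / max_length)
    parts = []
    for i in range(num_parts):
        body = text[i * max_length:(i + 1) * max_length]
        header = f"[START PART {i + 1}/{num_parts}]"
        footer = f"[END PART {i + 1}/{num_parts}]"
        if i < num_parts - 1:
            # Intermediate parts: ask the model to acknowledge and wait.
            parts.append(
                f'Do not answer yet. Just acknowledge as '
                f'"Part {i + 1}/{num_parts} received" and wait for the next part.\n'
                f"{header}\n{body}\n{footer}"
            )
        else:
            # Final part: tell the model it can start processing.
            parts.append(
                f"{header}\n{body}\n{footer}\n"
                f"ALL PARTS SENT. Now you can continue processing the request."
            )
    return parts

print(len(split_prompt("x" * 50, 20)))  # 50 chars at 20 per part -> 3 parts
```

The `math.ceil` call is equivalent to the negative floor-division trick `-(-len(text) // split_length)` used in the actual code.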
## Features - Python 3.9 - Web interface for splitting text into smaller chunks - Customizable maximum length for each chunk - Copy chunks individually to send to ChatGPT - Instructions for ChatGPT on how to process the chunks - Tests included - Easy deployment to Vercel included ## Usage example Follow these simple steps to use the ChatGPT Prompt Splitter web application, illustrated with screenshots. ### Step 1: Access the application Open your web browser and navigate to the application URL. https://chatgpt-prompt-splitter.jjdiaz.dev/ You should see the main screen, displaying the input fields for your long text prompt and maximum chunk length. ![Set Max Length](/static/screenshots/screenshot_main_screen.png) ### Step 2: Input the long prompt Enter the text you want to split into smaller chunks for use with ChatGPT. You can also specify a custom length for each chunk by entering the number of characters in the *"Max chars length..."* field. In this example, we are going to split the text into chunks of just 25 characters. ![Input Text](/static/screenshots/screenshot_example_text.png) ### Step 3: Click "Split" Click the "Split" button to process the text and divide it into smaller chunks. ![Click Split](/static/screenshots/screenshot_example_text_splitted.png) ### Step 4: Copy the chunks The application will display the text divided into smaller chunks. You can copy each chunk individually by clicking the "Copy" button next to it. ![Copy Chunks](/static/screenshots/screenshot_example_copy_chunks.png) ### Step 5: Paste the chunks into ChatGPT Now that you have your chunks copied, you can paste them into ChatGPT or any other language model with character limitations. ![Paste Chunks](/static/screenshots/screenshot_example_paste_chunks.png) That's it! You've successfully split a **long PROMPT** into smaller, manageable chunks using the **ChatGPT Prompt Splitter**. ## Getting Started ### Prerequisites - Python 3.x - Flask ### Installation 1.
Clone the repository: ```bash git clone https://github.com/jupediaz/chatgpt-prompt-splitter.git ``` 2. Change to the project directory: ```bash cd chatgpt-prompt-splitter ``` 3. Install the required dependencies: ```bash pip install -r requirements.txt ``` ### Usage #### Running the Flask application in development mode 1. Run the Flask application: ```bash vercel dev ``` 2. Open your web browser and navigate to the local URL printed in the terminal. #### Deploy the Flask application to production 1. Deploy the Flask application: ```bash vercel --prod ``` 2. Open your web browser and navigate to the deployment URL printed by Vercel. ## Running Tests This project includes a suite of unit tests to ensure the proper functionality of the tool. To run the tests, follow these steps: 1. Make sure you have the required dependencies installed: ```bash pip install -r requirements.txt ``` 2. Run the tests using the unittest module: ```bash python3 -m unittest discover tests ``` The test suite will run, and the results will be displayed in the terminal. ## License This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details. ## Contributing Contributions are welcome! Please read the [CONTRIBUTING](CONTRIBUTING.md) file for details on how to contribute to the project. ## Contact If you have any questions or suggestions, please contact me at [hello@jjdiaz.dev](mailto:hello@jjdiaz.dev). ## Disclaimer This project is not affiliated with OpenAI, Microsoft, or any other entity. The project is provided "as is" without warranty of any kind, express or implied. The author is not responsible for any damages or losses arising from the use of this project.
## Changelog ### 1.0.0 - Initial release ================================================ FILE: api/index.py ================================================ from flask import Flask, render_template, request from dotenv import load_dotenv import os import random import redis import string # Load environment variables from .env file load_dotenv() app = Flask(__name__) # Set up Redis client upstash_redis_url = os.environ.get('UPSTASH_REDIS_URL') redis_client = redis.from_url(upstash_redis_url) @app.route("/", methods=["GET", "POST"]) def index(): prompt = "" split_length = "" file_data = [] # incr returns the post-increment value, so a separate get is unnecessary visit_count = redis_client.incr("visit_counter") if request.method == "POST": prompt = request.form["prompt"] split_length = int(request.form["split_length"]) file_data = split_prompt(prompt, split_length) hash_value = generate_random_hash(8) return render_template("index.html", prompt=prompt, split_length=split_length, file_data=file_data, hash=hash_value, visit_count=visit_count) def split_prompt(text, split_length): if split_length <= 0: raise ValueError("Max length must be greater than 0.") num_parts = -(-len(text) // split_length) file_data = [] for i in range(num_parts): start = i * split_length end = min((i + 1) * split_length, len(text)) if i == num_parts - 1: content = f'[START PART {i + 1}/{num_parts}]\n' + text[start:end] + f'\n[END PART {i + 1}/{num_parts}]' content += '\nALL PARTS SENT. Now you can continue processing the request.' else: content = f'Do not answer yet. This is just another part of the text I want to send you. Just receive and acknowledge as "Part {i + 1}/{num_parts} received" and wait for the next part.\n[START PART {i + 1}/{num_parts}]\n' + text[start:end] + f'\n[END PART {i + 1}/{num_parts}]' content += f'\nRemember not to answer yet. Just acknowledge that you received this part with the message "Part {i + 1}/{num_parts} received" and wait for the next part.'
file_data.append({ 'name': f'split_{str(i + 1).zfill(3)}_of_{str(num_parts).zfill(3)}.txt', 'content': content }) return file_data def generate_random_hash(length): return ''.join(random.choice(string.ascii_letters + string.digits) for _ in range(length)) if __name__ == '__main__': app.run(host='0.0.0.0', port=int(os.environ.get('PORT', 3000))) ================================================ FILE: api/templates/index.html ================================================ Long PROMPTs Splitter

ChatGPT PROMPTs Splitter

Open-source tool for safely processing chunks of up to 15,000 characters per request

Visit count: {{ visit_count }}
Enter the PROMPT that you want to use for the ChatGPT request.
{{ prompt|length }} characters
Choose the max length for each split part.
{% if file_data %}

Instructions

The total length of the content that I want to send you is too large to send in only one piece.
        
To send you that content, I will follow this rule:
        
[START PART 1/10]
this is the content of the part 1 out of 10 in total
[END PART 1/10]
        
Then you just answer: "Part 1/10 received"
        
And when I tell you "ALL PARTS SENT", then you can continue processing the data and answering my requests.
This way we explain to ChatGPT how to process the messages we are going to send.
{% for file in file_data %} {% set partNumber = file.name[6:9]|int %} {% set totalParts = file.name[13:16]|int %} {% endfor %}
{% endif %}
Powered by Vercel
================================================ FILE: requirements.txt ================================================ Flask==2.2.2 redis==4.5.1 python-dotenv==1.0.0 ================================================ FILE: static/browserconfig.xml ================================================ #da532c ================================================ FILE: static/scripts.js ================================================ document.getElementById("prompt").addEventListener("input", function () { updateSplitButtonStatus(); updatePromptCharCount(); }); document.getElementById('split_length').addEventListener('input', updateSplitButtonStatus); function copyToClipboard(element) { const textArea = document.createElement("textarea"); textArea.value = element.getAttribute("data-content"); document.body.appendChild(textArea); textArea.select(); document.execCommand("copy"); document.body.removeChild(textArea); element.classList.add("clicked"); } function copyInstructions() { const instructionsButton = document.getElementById("copy-instructions-btn"); const instructions = document.getElementById("instructions").textContent; const textArea = document.createElement("textarea"); textArea.value = instructions; document.body.appendChild(textArea); textArea.select(); document.execCommand("copy"); document.body.removeChild(textArea); instructionsButton.classList.add("clicked"); } function toggleCustomLength(select) { const customLengthInput = document.getElementById("split_length"); if (select.value === "custom") { customLengthInput.style.display = "inline"; } else { customLengthInput.value = select.value; customLengthInput.style.display = "none"; } } function updateSplitButtonStatus() { const promptField = document.getElementById('prompt'); const splitLength = document.getElementById('split_length'); const splitBtn = document.getElementById('split-btn'); const promptLength = promptField.value.trim().length; const splitLengthValue = parseInt(splitLength.value); if (promptLength 
=== 0) { splitBtn.setAttribute('disabled', 'disabled'); splitBtn.classList.add('disabled'); splitBtn.textContent = 'Enter a prompt'; } else if (isNaN(splitLengthValue) || splitLengthValue === 0) { splitBtn.setAttribute('disabled', 'disabled'); splitBtn.classList.add('disabled'); splitBtn.textContent = 'Enter the length for calculating'; } else if (promptLength < splitLengthValue) { splitBtn.setAttribute('disabled', 'disabled'); splitBtn.classList.add('disabled'); splitBtn.textContent = 'Prompt is shorter than split length'; } else { splitBtn.removeAttribute('disabled'); splitBtn.classList.remove('disabled'); splitBtn.textContent = `Split into ${Math.ceil(promptLength / splitLengthValue)} parts`; } } function updatePromptCharCount() { const promptField = document.getElementById("prompt"); const charCount = document.getElementById("prompt-char-count"); const promptLength = promptField.value.trim().length; charCount.textContent = promptLength; } updateSplitButtonStatus(); ================================================ FILE: static/site.webmanifest ================================================ { "name": "", "short_name": "", "icons": [ { "src": "/android-chrome-192x192.png", "sizes": "192x192", "type": "image/png" }, { "src": "/android-chrome-384x384.png", "sizes": "384x384", "type": "image/png" } ], "theme_color": "#ffffff", "background_color": "#ffffff", "display": "standalone" } ================================================ FILE: static/styles.css ================================================ body, html { margin: 0; padding: 0; } body { font-family: Arial, sans-serif; background-color: #f0f0f0; display: flex; justify-content: center; align-items: center; overflow: auto; padding: 20px; } html { height: 100%; } *, textarea, input { box-sizing: border-box; font-family: inherit; } .container { max-width: 1200px; margin: 0 auto; padding: 20px; background-color: #fff; box-shadow: 0 2px 4px rgba(0, 0, 0, 0.1); } h1, h2 { text-align: center; width: 100%; margin: 
0; } h1 { font-size: 1rem; } h2 { font-size: 11px; font-weight: normal; } label { display: block; font-weight: bold; margin-top: 10px; } textarea, input { width: -webkit-fill-available; } textarea { width: 100%; height: 200px; margin-bottom: 20px; } button { display: block; width: 100%; padding: 10px; background-color: #007bff; color: #fff; font-size: 16px; border: none; cursor: pointer; margin-top: 20px; } button:hover { background-color: #0056b3; } .buttons-container { display: flex; flex-wrap: wrap; justify-content: space-between; margin-top: 20px; } .copy-btn, .copy-instructions-btn { padding: 10px; border: none; cursor: pointer; color: white; text-align: center; font-size: 13px; } .copy-btn { background-color: #4caf50; flex-basis: min-content; margin-bottom: 10px; } .copy-btn:hover { background-color: #3e8e41; } .clicked { background-color: #888; color: #ccc; } .clicked:hover { background-color: #666; } .copy-instructions-btn { background-color: #FF8000; margin-top: 10px; } .copy-instructions-btn:hover { background-color: #CC6600; } .custom-length { font-size: 14px; width: 100px; height: 24px; } select { height: 24px; } .instructions { margin-top: 20px; font-size: 12px; background-color: lightgray; padding: 1px 10px 10px 10px; } help { font-size: 12px; color: #888; } button[disabled] { background-color: #cccccc; color: #666666; cursor: not-allowed; } .char-count-container { text-align: right; } .header { display: flex; justify-content: space-between; align-items: center; width: 100%; } .logo { max-width: 90%; } .header-text { display: flex; flex-direction: column; align-items: flex-start; justify-content: center; width: 80%; } .header-logo { width: 20%; display: flex; justify-content: flex-end; padding-right: 10px; } pre { overflow-x: auto; white-space: pre-wrap; word-wrap: break-word; max-width: 100%; padding: 1em; background-color: #f8f8f8; border-radius: 5px; } footer { margin-top: 20px; padding-top: 10px; border-top: 1px solid #ccc; display: flex; width: 
100%; bottom: 0; font-size: 12px; } .left, .right { width: 50%; } .left { text-align: left; } .right { text-align: right; } .powered-by-vercel { display: flex; align-items: center; justify-content: center; margin-top: 20px; } .powered-by-vercel img { width: 180px; height: auto; display: flex; align-items: center; justify-content: center; } .visits { text-align:right; font-size: 14px; } .how-this-works { font-size: 14px; text-align: center; width: 100%; } @media (min-width: 768px) { textarea { height: 300px; } h1 { font-size: 2rem; margin-bottom: 20px; } h2 { font-size: 16px; margin-bottom: 30px; } } ================================================ FILE: tests/test_chatgpt_prompt_splitter.py ================================================ import unittest import sys import os sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))) from api.index import split_prompt class TestChatGPTPromptSplitter(unittest.TestCase): def test_split_prompt_single_chunk(self): input_text = "This is a short text." max_length = 50 chunks = split_prompt(input_text, max_length) self.assertEqual(len(chunks), 1) self.assertIn(input_text, chunks[0]['content']) def test_split_prompt_multiple_chunks(self): input_text = "This is a long text that should be split into multiple chunks." max_length = 20 chunks = split_prompt(input_text, max_length) self.assertEqual(len(chunks), 4) def test_split_prompt_chunk_length(self): input_text = "This is a long text that should be split into multiple chunks with a specified maximum length." max_length = 30 chunks = split_prompt(input_text, max_length) for chunk in chunks: # The wrapper instructions may exceed max_length, so check only the raw text between the part markers content = chunk['content'] body_start = content.index(']\n') + 2 body_end = content.index('\n[END PART') self.assertLessEqual(len(content[body_start:body_end]), max_length) def test_split_prompt_empty_input(self): input_text = "" max_length = 50 chunks = split_prompt(input_text, max_length) self.assertEqual(len(chunks), 0) def test_split_prompt_negative_max_length(self): input_text = "This is a short text."
max_length = -10 with self.assertRaises(ValueError): split_prompt(input_text, max_length) ================================================ FILE: vercel.json ================================================ { "rewrites": [ { "source": "/(.*)", "destination": "/api/index" } ] }