Wastholm.com

requirements.txt (or “requirements in setup() call”) is not a valid way to manage dependencies, and it hasn’t been for the past 5+ years. If you are still using requirements.txt, it shows you need professional help. Luckily, I’m a professional.

Let’s go over some bad / good / example practices for living your best Python life in 2024.
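The modern alternative the post is pointing at is, presumably, declaring dependencies in pyproject.toml (PEP 621). A minimal sketch of such a file; the project name, version constraints and extras below are illustrative, not taken from the article:

[project]
name = "my-app"
version = "0.1.0"
requires-python = ">=3.9"
dependencies = [
    "httpx>=0.27",     # runtime dependencies live here, not in requirements.txt
    "pydantic>=2",
]

[project.optional-dependencies]
dev = ["pytest", "ruff"]   # development tooling declared as an extra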

LLM provides a Python API for executing prompts, in addition to the command-line interface.

[...]

To run a prompt against the gpt-3.5-turbo model, run this:

import llm

# Load the model by name and attach an API key (placeholder shown here).
model = llm.get_model("gpt-3.5-turbo")
model.key = "YOUR_API_KEY_HERE"

# Execute the prompt and print the generated text.
response = model.prompt("Five surprising names for a pet pelican")
print(response.text())

Simple command line tool for text to image generation using OpenAI's CLIP and Siren.

Mist.io helps you manage and monitor your virtual machines, across different clouds, using any device that can access the web.

Define user behaviour with Python code, and swarm your system with millions of simultaneous users.
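This reads like the tagline of the Locust load-testing tool. If so, a minimal sketch of a user class, assuming the locust package and its current HttpUser API (the target path and wait times are illustrative):

from locust import HttpUser, between, task

class WebsiteUser(HttpUser):
    # Each simulated user waits 1-5 seconds between tasks (illustrative values).
    wait_time = between(1, 5)

    @task
    def index(self):
        # Simulated users repeatedly fetch the front page of the system under test.
        self.client.get("/")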

Python, on the other hand, has problems of its own. The biggest is that it has dozens of web application frameworks, but none of them are any good. Pythonists are well aware of the first part but apparently not of the second, since when I tell them that I’m using my own library, the universal response is “I don’t think Python needs another web application framework”. Yes, Python needs fewer web application frameworks. But it also needs one that doesn’t suck.

Detect the language of text.

Every major language has thousands of libraries which enable programmers to reach higher, further and faster than before. Package managers (the online systems for sharing code) are key to a language's success; Perl, PHP, Python, Ruby and Node.js all have strong offerings. But which one is the best and what can we learn from each of them? This article is the first in a two-part series where I review each package manager. Part one focuses on searching and using packages and part two will look at how easy it is to upload and share packages.

TextBlob is a Python (2 and 3) library for processing textual data. It provides a simple API for diving into common natural language processing (NLP) tasks such as part-of-speech tagging, noun phrase extraction, sentiment analysis, translation, and more.
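A small sketch of the API described above, assuming the textblob package; the sample sentence and the outputs hinted at in the comments are illustrative:

from textblob import TextBlob

# Corpora must be downloaded once first: python -m textblob.download_corpora
blob = TextBlob("TextBlob makes common NLP tasks surprisingly pleasant in Python.")

print(blob.tags)          # part-of-speech tags, e.g. ('TextBlob', 'NNP'), ...
print(blob.noun_phrases)  # extracted noun phrases
print(blob.sentiment)     # Sentiment(polarity=..., subjectivity=...)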

Last week, while working on new features for our product, I had to find a quick and efficient way to extract the main topics/objects from a sentence. Since I’m using Python, I initially thought that it’s going to be a very easy task to achieve with NLTK. However, when I tried its default tools (POS tagger, Parser…), I indeed got quite accurate results, but performance was pretty bad. So I had to find a better way. Like I did in my previous post, I’ll start with the bottom line – Here you can find my code for extracting the main topics/noun phrases from a given sentence. It works fine with real sentences (from a blog/news article). It’s a bit less accurate compared to the default NLTK tools, but it works much faster!
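The author's linked code is not reproduced here, but a rough illustration of the lighter-weight approach the post describes, using NLTK's POS tagger plus a regex chunker instead of a full parser, might look like this; the grammar and example sentence are illustrative:

import nltk

# Requires the NLTK tokenizer and tagger data (see nltk.download()).
def noun_phrases(sentence):
    # Tag words, then chunk with a simple grammar instead of running a full parser.
    tagged = nltk.pos_tag(nltk.word_tokenize(sentence))
    grammar = "NP: {<DT>?<JJ>*<NN.*>+}"   # optional determiner, adjectives, nouns
    tree = nltk.RegexpParser(grammar).parse(tagged)
    return [" ".join(word for word, tag in chunk.leaves())
            for chunk in tree.subtrees(lambda t: t.label() == "NP")]

print(noun_phrases("The quick brown fox jumped over the lazy dog near the old barn."))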
