
Prompt engineering: is being an AI ‘whisperer’ the job of the future?

As generative AI settles into the mainstream, a growing number of courses and certifications promise entry into the “hot job” of prompt engineering. Skill in using natural language (such as English) to “prompt” useful content out of AI models such as ChatGPT and Midjourney seems like something many employers would value. But is it...


What are Large Language Models?

In the realm of artificial intelligence, Large Language Models (LLMs) are increasingly becoming the linchpin: the foundational architecture driving a new wave of innovation. If you’re a developer or a data scientist, you’ve likely encountered the acronyms and the buzz: GPT from OpenAI, Google’s PaLM 2 (the underpinning of its Bard chatbot), and Falcon. These aren’t...


LoRA vs. Fine-Tuning LLMs

LoRA (Low-Rank Adaptation) and full fine-tuning are two methods for adapting large language models (LLMs) to specific tasks or domains. LLMs such as GPT-3, RoBERTa, and DeBERTa are pre-trained on massive amounts of general-domain data and have shown impressive performance on a variety of natural language processing (NLP) tasks. Why fine-tune an LLM? Fine-tuning of LLMs...
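The core idea behind LoRA can be shown in a few lines: instead of updating a full weight matrix, you learn a low-rank correction to it. Here is a minimal NumPy sketch (the dimensions, rank, and scaling factor are illustrative assumptions, not values from the post):

```python
import numpy as np

# Toy illustration of LoRA: rather than fine-tuning a full d x k weight
# matrix W, learn a low-rank update B @ A with rank r << min(d, k).
d, k, r = 512, 512, 8          # illustrative dimensions and rank
alpha = 16                     # illustrative scaling hyperparameter
rng = np.random.default_rng(0)

W = rng.standard_normal((d, k))          # frozen pre-trained weight
A = rng.standard_normal((r, k)) * 0.01   # trainable, r x k
B = np.zeros((d, r))                     # trainable, d x r (zero init, so
                                         # the update starts as a no-op)

def lora_forward(x):
    # Frozen path plus the scaled low-rank update:
    # y = x W^T + (alpha / r) * x (B A)^T
    return x @ W.T + (alpha / r) * (x @ (B @ A).T)

full_params = d * k              # parameters touched by full fine-tuning
lora_params = r * (d + k)        # trainable parameters under LoRA
print(full_params, lora_params)  # 262144 vs 8192, ~3% of the full update
```

Because only `A` and `B` are trained while `W` stays frozen, the number of trainable parameters drops from `d * k` to `r * (d + k)`, which is where LoRA's memory savings come from.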


Harnessing the Power of Large Language Models for Next-Gen Applications

The realm of Large Language Models (LLMs) has been expanding, with a notable trend toward open-source models or close counterparts. With more models now available under permissive licenses, developers have a broader set of tools for crafting applications. In this blog post, we explore the diverse methodologies for leveraging LLMs, ranked from...


The Secrets of GPT-4 Leaked?

In a recent development, internal details of OpenAI’s GPT-4 have reportedly been leaked. The news has sparked discussion across the artificial intelligence community, given that GPT-4 is a significant progression from its predecessor, GPT-3, in both size and complexity. The advancement in the model’s structure and scale is noteworthy, indicating a new phase in...


Ghosts in the AI machinery

Ever wonder how (generative) AI gets so smart? It’s not just algorithms and code. It’s the work of human annotators, the ghosts in the machine, people who sift through mountains of raw data, categorizing and labeling it, all to train the machines we’ve grown to depend on. This ‘ghost’ isn’t an ethereal presence but a...
