Explore the New Open Source LLM Falcon 180B: A Strong Alternative to Google’s Palm 2

Hugging Face introduced Falcon 180B, a potent open source LLM. It competes with Google's Palm 2 and ships without guardrails.

Our Staff

In an exciting reveal, Hugging Face rolled out the Falcon 180B – a colossal open source LLM (Large Language Model) – touted as rivaling the performance prowess of Google’s cutting-edge AI, Palm 2. A unique edge the Falcon 180B delivers is the complete omission of guardrails, granting uninhibited creative freedom, albeit with the potential to produce unsafe or harmful outputs.

Falcon 180B Performance

Imagine standing atop the highest peak, unparalleled, witnessing the world unfurl beneath in all its panoramic glory. That’s essentially what “state of the art” signifies in the technological realm – a paragon of performance that either matches or surpasses the pre-existing pinnacle of prowess. 

No wonder it’s a red-letter day in the scientific community when an algorithm or an expansive language model reaches that apex, the universally revered “state of the art” performance. 

Fast forward to the present day, we have Hugging Face in the headlines, heralding their brainchild – Falcon 180B, as the new era’s technological trailblazer. 

But Falcon 180B isn’t just another entrant to the already populated world of open-source models. It’s special. Not only does it ace the performance metrics for natural language tasks, it effectively outflanks its contemporaries, making a strong performance pitch that dares to “rival” Google’s acclaimed Palm 2. 

This is far from mere technological puffery, mind you. 

The audacious assertion by Hugging Face, situating Falcon 180B in the same league as Palm 2, is fortified by a robust foundation of benchmark data, lending it credibility that is hard to dispute. 

Scratching beneath the surface, we uncover data pointing to Falcon 180B’s noteworthy superiority over the previously reigning open source model, Llama 2 70B. The model doesn’t just edge past Llama 2 70B; it massively outperforms it on multiple tasks that gauge the sheer power and the finesse of an AI model. 

Lo and behold, Falcon 180B doesn’t stop at Llama 2 70B; it proceeds to outshine even OpenAI’s GPT-3.5. 

The statistics don’t lie, and they unequivocally spotlight Falcon 180B performing shoulder to shoulder with none other than Google’s renowned Palm 2.

Screenshot of Performance Comparison

The announcement explained:

“Falcon 180B is the best openly released LLM today, outperforming Llama 2 70B and OpenAI’s GPT-3.5…

Falcon 180B typically sits somewhere between GPT 3.5 and GPT4 depending on the evaluation benchmark…”

The features of the open-source LLM don’t stop at its current functionality; the model suggests a genuinely exciting proposition for tech enthusiasts. With further user-oriented enhancements, its performance may surge to even more impressive heights. 


The Training Dataset for Falcon 180B

In an intriguing development, Hugging Face has disclosed a research paper (PDF version here). This literature gives an in-depth look into the dataset used to train Falcon 180B. 

This archive of information bears the appellation, ‘The RefinedWeb Dataset’. 

Interestingly, the dataset is forged merely from the vast ether of the Internet. It draws its essence from the open-source bounty known as Common Crawl – a massive, unrestricted reserve of the world wide web’s knowledge. 

The collected data then endures a rigorous filtration system followed by a dedication to the practice known as deduplication. This is essentially the act of purging duplicate or superfluous information, honing the dataset to a gleaming state of quality. 

Some might wonder about the intent behind such rigorous filtration techniques. The goal is clear cut: Hugging Face’s researchers aim to obliterate machine-generated spam, prune out repeated and boilerplate content, expunge plagiarized data, and excise any data that is at odds with the representation of natural language.

The research paper explains:

“Due to crawling errors and low quality sources, many documents contain repeated sequences: this may cause pathological behavior in the final model…

…A significant fraction of pages are machine-generated spam, made predominantly of lists of keywords, boilerplate text, or sequences of special characters.

Such documents are not suitable for language modeling…

…We adopt an aggressive deduplication strategy, combining both fuzzy document matches and exact sequences removal.”
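The strategy the quote describes, combining fuzzy document matching with exact sequence removal, can be approximated with a small shingle-and-Jaccard sketch. Everything below (the `shingles`, `jaccard`, and `dedupe` helpers and the 0.8 threshold) is illustrative only, not the actual RefinedWeb pipeline:

```python
def shingles(text, n=3):
    """Split a document into overlapping word n-grams ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Set similarity: size of intersection over size of union."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def dedupe(docs, threshold=0.8):
    """Keep each document unless it is a near-duplicate of one already kept."""
    kept, kept_shingles = [], []
    for doc in docs:
        s = shingles(doc)
        # Fuzzy match: drop the document if it overlaps heavily with a kept one.
        if any(jaccard(s, ks) >= threshold for ks in kept_shingles):
            continue
        kept.append(doc)
        kept_shingles.append(s)
    return kept

corpus = [
    "the quick brown fox jumps over the lazy dog",
    "the quick brown fox jumps over the lazy dog today",  # near-duplicate
    "an entirely different document about language models",
]
print(dedupe(corpus))  # the near-duplicate second document is dropped
```

Production systems typically replace the pairwise Jaccard comparison with locality-sensitive hashing (e.g., MinHash) so deduplication scales to billions of documents.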

Because the dataset contains only web data, unlike other datasets that also include non-web sources, it requires this filtering and cleaning. 

The researchers’ effort to remove useless data resulted in a dataset as effective as other curated datasets, which include pirated books and non-web data sources. 

They conclude by stating that their dataset is a success:

“We have demonstrated that stringent filtering and deduplication could result in a five trillion tokens web only dataset suitable to produce models competitive with the state-of-the-art, even outperforming LLMs trained on curated corpora.”

Open Source LLM: Falcon 180B Operates Without Any Guardrails

What sets Falcon 180B apart is its unbridled freedom. Devoid of any alignment tuning measures to keep it in line, this software is uninhibited in the creation of outputs, even to the extent of generating harmful or unsafe results or fabricating facts. 

As a result, its potential is truly unparalleled. The ability of Falcon 180B extends beyond the boundaries set by tech giants like OpenAI and Google, permitting a level of output creativity that is simply not achievable with their products. 

These unique features aren’t tucked away, they are prominently featured in a part of the publication aptly named ‘limitations’, providing a transparent overview of Falcon 180B’s attributes and scopes.

Hugging Face advises:

“Limitations: the model can and will produce factually incorrect information, hallucinating facts and actions.

As it has not undergone any advanced tuning/alignment, it can produce problematic outputs, especially if prompted to do so.”

Utilizing Falcon 180B for Business Purposes

Exciting developments loom as Hugging Face has now granted permission for the commercial employment of Falcon 180B. 

Before you plunge into using this impressive tech tool, it’s important to note that it’s dispensed under an inherently restrictive license. 

Given the complexities of legal jargon bundled in licensing terms, Hugging Face wisely advises potential Falcon 180B users to seek out professional legal counsel prior to its utilization.

Falcon 180B Serves as a Launching Pad

Intriguingly, the model has received no instruction training, so it requires further preparation before it can function optimally as an AI chatbot. 

Imagine it as an impressive blank canvas, eagerly awaiting further enhancements to be molded into a masterpiece that syncs with the users’ varying needs. Interestingly, Hugging Face has also unveiled a chat model, albeit deemed as a “bare-bones” offering.

Hugging Face explains:

“The base model has no prompt format. Remember that it’s not a conversational model or trained with instructions, so don’t expect it to generate conversational responses—the pretrained model is a great platform for further finetuning, but you probably shouldn’t directly use it out of the box.

The Chat model has a very simple conversation structure.”
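The “very simple conversation structure” the quote mentions can be illustrated with a small prompt builder. The `User:`/`Falcon:` turn format follows the announcement’s description; `build_prompt` itself is a hypothetical helper written for this sketch, not a Hugging Face API:

```python
def build_prompt(turns, system=None):
    """Assemble a chat prompt from (user, falcon) turn pairs, ending with an
    open "Falcon:" line for the model to complete."""
    lines = []
    if system:
        lines.append(f"System: {system}")
    for user_msg, falcon_msg in turns:
        lines.append(f"User: {user_msg}")
        if falcon_msg is not None:
            lines.append(f"Falcon: {falcon_msg}")
    lines.append("Falcon:")  # leave the last turn open for generation
    return "\n".join(lines)

prompt = build_prompt(
    [("What is an open source LLM?", None)],
    system="You are a helpful assistant.",
)
print(prompt)
```

The resulting string would be fed to the chat model as its input; the base model, by contrast, has no prompt format at all and is meant for further fine-tuning.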

Read the official announcement:

Spread Your Wings: Falcon 180B is here

YouTube source: James Briggs
