When DeepSeek-R1 first appeared, the prevailing fear that shook the industry was that advanced reasoning could be achieved with less infrastructure.
As it turns out, that's not necessarily the case. At least according to Together AI, the emergence of DeepSeek and open-source reasoning has had exactly the opposite effect: instead of reducing the need for infrastructure, it is increasing it.
That increased demand has helped fuel the growth of Together AI's platform and business. The company today announced a $305 million funding round, led by General Catalyst and co-led by Prosperity7. Together AI first emerged in 2023 with the goal of simplifying enterprise use of open-source large language models (LLMs). The company expanded in 2024 with the Together Enterprise platform, which enables AI deployment in virtual private cloud (VPC) and on-premises environments. In 2025, Together AI is expanding its platform again with reasoning clusters and agentic AI capabilities.
The company claims that its AI deployment platform has more than 450,000 registered developers and that the business has grown 6X year-over-year. Its customers include enterprises as well as AI startups such as Krea AI, Captions and Pika Labs.
DeepSeek-R1's outsized impact on AI infrastructure demand
DeepSeek-R1 was hugely disruptive when it first appeared, for several reasons. One of them was the implication that a leading-edge open-source reasoning model could be built and deployed with less infrastructure than a proprietary model.
In practice, however, Together AI CEO Vipul Ved Prakash explained, the company has grown in part to help support increased demand for DeepSeek-R1 workloads.
"It's a fairly expensive model to run inference on," he said. "It has 671 billion parameters and needs to be distributed across multiple servers. And because the quality is higher, there is more demand at the high end, which means you need more capacity."
He also noted that DeepSeek-R1 typically generates longer-running requests that can last two to three minutes. Tremendous user demand for DeepSeek-R1 is driving the need for more infrastructure.
To meet that demand, Together AI has rolled out a service it calls "reasoning clusters" that provision dedicated capacity, ranging from 128 to 2,000 chips, to run models at the best possible performance.
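To see why a 671-billion-parameter model must span multiple servers, a back-of-the-envelope memory calculation helps. The figures below are illustrative assumptions: FP8 weights at one byte per parameter, and 80 GB of memory per GPU (typical of an NVIDIA H100); real deployments also need memory for KV cache and activations.

```python
import math

PARAMS = 671e9        # DeepSeek-R1's parameter count
BYTES_PER_PARAM = 1   # FP8 weights: 1 byte per parameter (assumption)
GPU_MEMORY_GB = 80    # e.g., one 80 GB GPU (assumption)

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9
min_gpus = math.ceil(weights_gb / GPU_MEMORY_GB)

print(f"Weights alone: {weights_gb:.0f} GB")          # Weights alone: 671 GB
print(f"Minimum GPUs just for weights: {min_gpus}")   # Minimum GPUs just for weights: 9
```

Even under these generous assumptions, the weights alone exceed what a single 8-GPU server can hold, before accounting for KV cache, which is why dedicated multi-server capacity matters.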
How Together AI is helping organizations use reasoning AI
There are a number of specific areas where Together AI sees reasoning models being used. These include:
- Coding agents: Reasoning models help break larger problems down into steps.
- Reducing hallucinations: The reasoning process helps verify model outputs, reducing hallucinations, which is important for applications where accuracy is crucial.
- Improving non-reasoning models: Customers distill from reasoning models to improve the quality of non-reasoning models.
- Enabling self-improvement: The use of reinforcement learning with reasoning models allows models to recursively self-improve without relying on large amounts of human-labeled data.
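As a concrete example of working with reasoning output: DeepSeek-R1 emits its chain of thought inside `<think>...</think>` tags before the final answer, so applications (for the verification or distillation uses above) typically split the two. A minimal sketch; the sample response text is invented for illustration:

```python
def split_reasoning(response: str) -> tuple[str, str]:
    """Separate a DeepSeek-R1 style response into (reasoning, answer).

    R1 wraps its chain of thought in <think>...</think>; everything
    after the closing tag is the user-facing answer.
    """
    open_tag, close_tag = "<think>", "</think>"
    start = response.find(open_tag)
    end = response.find(close_tag)
    if start == -1 or end == -1:
        return "", response.strip()  # no reasoning block present
    reasoning = response[start + len(open_tag):end].strip()
    answer = response[end + len(close_tag):].strip()
    return reasoning, answer

# Invented sample output for illustration:
sample = "<think>2 servers x 8 GPUs = 16 GPUs total.</think>You need 16 GPUs."
reasoning, answer = split_reasoning(sample)
print(reasoning)  # 2 servers x 8 GPUs = 16 GPUs total.
print(answer)     # You need 16 GPUs.
```

The extracted reasoning can then be checked against the answer, or collected as training data for distilling a smaller, non-reasoning model.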
Agentic AI is also driving increased demand for AI infrastructure
Together AI is also seeing increased infrastructure demand as its customers adopt agentic AI.
Prakash explained that agentic workflows, where a single user request can result in thousands of API calls to complete a task, are putting more demand on Together AI's infrastructure.
To help support agentic AI workloads, Together AI recently acquired CodeSandbox, whose technology provides lightweight, fast-booting virtual machines (VMs) that can execute arbitrary, secure code inside the Together AI cloud, where the language models also reside. This allows Together AI to reduce the latency between agentic code and the models it needs to call, improving the performance of agentic workflows.
Nvidia Blackwell is already having an impact
All AI platforms are facing ever-increasing demands.
That's one of the reasons why Nvidia continues to roll out new silicon that delivers more performance. Nvidia's latest chip is the Blackwell GPU, which is now being deployed at Together AI.
Prakash said Nvidia Blackwell chips cost about 25% more than the previous generation but deliver 2X the performance. The GB200 platform with Blackwell chips is particularly well suited to training and inference of mixture-of-experts (MoE) models, which are trained across multiple InfiniBand-connected servers. He noted that Blackwell chips are also expected to provide a bigger performance boost for inference of larger models than for smaller ones.
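Prakash's figures imply a straightforward price-performance gain: paying 25% more for 2X the throughput works out to roughly 1.6X more performance per dollar. A quick check:

```python
cost_ratio = 1.25   # Blackwell costs ~25% more than the prior generation
perf_ratio = 2.0    # ~2X the performance

perf_per_dollar_gain = perf_ratio / cost_ratio
print(f"{perf_per_dollar_gain:.2f}x performance per dollar")  # 1.60x performance per dollar
```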
The competitive AI infrastructure landscape
The market for AI infrastructure platforms is intensely competitive.
Together AI faces competition from both established cloud providers and AI infrastructure startups. All the hyperscalers, including Microsoft, AWS and Google, have AI platforms. There is also an emerging category of AI-focused players, such as Groq and SambaNova, all aiming for a slice of the lucrative market.
Together AI has a full-stack offering, including GPU infrastructure with software platform layers on top. This allows customers to build with open-source models or develop their own models on the Together AI platform. The company also focuses on developing optimizations and fast runtimes for both inference and training.
"For instance, we serve the DeepSeek-R1 model at 85 tokens per second, while Azure serves it at 7 tokens per second," Prakash said. "There is a fairly wide gap in the performance and cost that we can deliver to our customers."
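To put those throughput numbers in perspective, consider a reasoning-heavy response of 10,000 output tokens (an illustrative length for a multi-minute reasoning trace; the 85 and 7 tokens-per-second figures are Prakash's):

```python
OUTPUT_TOKENS = 10_000   # illustrative reasoning-trace length (assumption)

for provider, tokens_per_sec in [("Together AI", 85), ("Azure", 7)]:
    minutes = OUTPUT_TOKENS / tokens_per_sec / 60
    print(f"{provider}: {minutes:.1f} minutes")
# Together AI: 2.0 minutes
# Azure: 23.8 minutes
```

At these rates, the same reasoning request that finishes in about two minutes on one platform would take over twenty on the other, which is why per-token throughput is a headline metric for inference providers.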