When we talk about AI, most people immediately think of Google or OpenAI, the company behind ChatGPT. But while those two often grab all the headlines, Mark Zuckerberg's Meta is also moving fast, and it is now making some serious noise with two major AI announcements.
The company behind Facebook, Instagram, WhatsApp and Threads has just unveiled a new AI model that can, in a sense, think before it acts, as well as a new AI-powered video editing feature. The latter is clearly aimed at the billions of people who use a Meta platform.
But let's start with the one that sounds straight out of a science-fiction film: the new model is called V-JEPA 2, and it is essentially a brain for robots and AI agents. Its purpose? To understand and predict the physical world, helping them grasp how it reacts to their actions, just as we humans do without thinking about it.
When we move through a crowd, we are constantly predicting what is about to happen: avoiding people, sidestepping obstacles and heading toward our goal. We do not stop to analyze every move; we just know what is likely to happen. Meta calls V-JEPA 2 a "world model," designed to give AI a similar built-in intuition.
This world model lets AI do three basic things: understand, predict and plan. V-JEPA 2 is trained on video footage showing how objects move, how people interact with them and how things generally behave in the physical world.
It builds on the original V-JEPA model from last year, but it is now better at understanding unfamiliar environments, such as when a robot encounters a new place or task.
Meta says it tested V-JEPA 2 in the lab, where robots using the model managed to do things like reaching for objects, picking them up and moving them around. That may sound basic, but in the world of robotics, it is a big deal.
Of course, Meta is not the only company chasing this kind of AI. Google launched its Gemini 2.0 model last year, focused on improving AI's reasoning, memory and planning. OpenAI is also in the game with AI agents that can handle tasks for you. Meta, however, seems to be leaning into helpful use cases, but at the end of the day, no one really knows how it will all play out.
It is clear we are moving toward a future where AI does not just respond to prompts; it actually starts working for us. And yes, that is exciting and a little nerve-racking. On one hand, these tools can help people who really need them. On the other, there is always the risk of becoming too reliant on them. What happens when AI starts thinking instead of us?
Looking forward
Now you can edit videos with Meta AI
While V-JEPA 2 is all about understanding the real world, Meta's second announcement is focused on how you create your digital one. The company has just rolled out a brand-new AI-powered video editing feature that is already available across the Meta AI app, the meta.ai website and a dedicated new app called Edits.
This tool lets you remix short-form videos using preset prompts that can completely change your outfit, background, vibe or even the entire style of the clip. It is now available in the United States and more than a dozen other countries.
AI will help you edit your videos. | Image Credit – Meta
Powered by Meta's Movie Gen models, this feature is just the beginning. Meta says that later this year, you will be able to edit your videos with your own text prompts directly in Meta AI.
The editing process is simple: upload a video to one of the supported platforms, then browse through more than 50 preset editing prompts. For now, you can edit up to 10 seconds of a video for free, but that is a limited-time offer.
For now, you get 50 editing prompts. | Image Credit – Meta
You can transform your clip into a retro comic book scene, complete with vintage-style visuals. Or give a gloomy video a dreamy makeover with soft-focus lighting. You can even make it feel like a neon-soaked video game, with your outfit and surroundings matching the theme.
Once you are done, you can share your creation directly to Facebook or Instagram from the Meta AI app or Edits. If you are on meta.ai or using the app, you can also post to the Discover feed.
And while this may sound like fun, and yes, it definitely is, it is also another reminder of where we are headed. Much like Google's new Flow tool, which can produce hyper-realistic videos, AI-powered editors like this can blur the line between what is real and what is not. We have already seen deepfake-style videos go viral and fool people. And sure, Meta's tool is meant for creative edits, not deception, but I would say it is still a step down the same path.
Read the latest from Tsveta Ermenkova


