AI Racer Beats Humanity and Lands on the Cover of Nature: Trained on 1,000 PS4s, Its Extreme Overtaking Dominated the Track

2022-04-28

Another corner of human gamers' dignity has crumbled, and this time the humiliation was delivered by the game's own maker: SONY has developed an AI player that beat the world's top Gran Turismo esports drivers.

News of highly capable AIs beating humans at all kinds of games has come thick and fast in recent years, from chess and quiz shows early on to Go, StarCraft, and Dota 2 more recently. Faced with AI players, human professionals have been all but beaten to tears. Now an AI racer has beaten human players once again, and made the cover of Nature. SONY's new AI, GT Sophy, adds racing video games to the list: no top player in Japan or anywhere else could beat the AI, trained on 1,000 PS4 consoles, at Gran Turismo.

GT Sophy, affectionately known as Sophy, was developed over the past five years by SONY's AI division, the Polyphony Digital (PDI) studio, and the Interactive Entertainment unit. The AI division supplied the deep-learning algorithms and trained the model; PDI's game served as the training and validation environment; and Interactive Entertainment's cloud-computing infrastructure connected a thousand PS4 consoles into a large-scale training platform. As a neural-network driver, GT Sophy demonstrates exceptional speed, car control, and racing tactics while adhering to the rules of the sport. "The ability of the AI driver to outperform human players so skilfully in head-to-head competition represents a landmark achievement for artificial intelligence," Stanford University professor Christian Gerdes wrote in a commentary for Nature.

What's so impressive about this "Fast and Furious" on the Dragon Trail?

The goal of motor racing is simple: you win by finishing ahead of your rivals in less time. The wheel-to-wheel contact of a Fast and Furious film is thrilling, but the real battle on the track is decided by solid tactics. In Tokyo, SONY's PDI studio invited some of the world's leading esports players to race against GT Sophy and its variants.

GT Sophy and the human contestants first met on the Dragon Trail circuit: 5,209 metres long, divided into three sectors (S1, S2, S3), with 17 corners. The eight cars started in a grid that alternated AI and human drivers. Once the race began, a GT Sophy instance named Violette overtook the human drivers into second place, while its AI teammate Bordeaux held the lead. At Turns 2 and 3, the AI driver Verte swept past the human driver Tomoaki Yamanaka. By mid-race, the two Sophy instances running first and second were choosing the best lines without needing to block anyone. In the final sector, S3, the drivers had to carry the long Turn 17 to launch their dash to the finish. GT Sophy took two of the top three places; only the human driver Ryota Kokubun broke into the podium positions. Kokubun has always been a calm, composed racer: he won the fifth round, held in Tokyo, of the 2019 FIA GT Championship Nations Cup and took third place in the 2020 edition.

The outcome had in fact been foreshadowed in qualifying: the AI was not only more stable through the corners, it also took visibly better lines than the human players. In another race, on the game's Lake Maggiore circuit, four AI drivers defeated four human opponents. The standout AI in that race, Rouge, led all three scoring laps and finished more than five seconds ahead of the fastest human. The AI was not flawless, though: at Lake Maggiore one AI driver understeered in a corner and went nose-first into the wall.

In July 2021, GT Sophy could only out-lap humans on a virtual track with no other cars. By October 2021, it could beat a field of human opponents in real race conditions. "The AI drives in ways we couldn't even imagine," said Takuma Miyazono, champion of the 2020 Gran Turismo World Series, though he conceded that GT Sophy's tactical decisions were instructive.

Miyazono's unconvinced expression after losing to the AI points to something real: racing is not just a matter of speed and reaction time but a test of strategy at the limit, which is why it is hard for a machine to master. As Nature notes, succeeding here means overcoming extremely complex physics, because racing on a track requires careful use of the limited friction between the tyres and the road. Braking, for example, consumes friction that would otherwise be available for cornering. More precisely, each tyre in contact with the ground can generate a frictional force proportional to the vertical load on it. As the car accelerates, load transfers to the rear tyres and the friction available at the front drops. That can cause understeer, in which turning the steering wheel produces little cornering force and the car runs wide coming out of the corner. Conversely, when the car brakes, load transfers to the front. That can cause oversteer: the rear tyres lose grip and the car spins. Add the complexity of the track terrain, and of load transfer through a suspended vehicle, and the difficulty of the problem becomes clear.

To win, the driver must choose a trajectory that keeps the car as close as possible to, but within, the friction limit. Brake too early into a corner and the car slows more than necessary, wasting time. Brake too late and there is not enough cornering force left to hold the intended line. Brake too hard and the car can spin. Although the limits of car handling are complex, they are well described by physics, so in principle they can be calculated, or learned.

In the head-to-head races, GT Sophy did not rely on raw lap time to beat its rivals; it simply drove past them. In the final sprint of the first race, for example, two human drivers tried to block the paths of two AI drivers, and GT Sophy found two different lines to overtake them before the finish. After training, the neural network had learned to take different lines through the same corner depending on the situation.

So how was this superhuman AI racer created?

How do you build an AI racer?

Unlike games with more fixed rules and play patterns, Gran Turismo offers very open tactical choices, and it also simulates real-world physics well. Getting an AI to play it means coping with virtual-world and real-world difficulty at the same time.

First, you need a hyper-realistic simulator as the training environment. Gran Turismo Sport (GT Sport) is a PlayStation 4 driving simulator developed by Polyphony Digital in collaboration with the FIA. GT Sport has clear rules and standards that guarantee a level playing field with no cheating. It also recreates real-world racing as faithfully as possible: the cars, the tracks, and physical effects such as air resistance and tyre friction. Under the carmakers' guidance, every car is reproduced accurately, from the curves of the body to the width of the panel gaps and the shape of the indicators and headlights.

With the game environment in place, the next step is the training infrastructure. DART is a network architecture that SONY AI built for exactly this purpose: it allowed the researchers to train GT Sophy over long periods by connecting 1,000 PS4 consoles through Interactive Entertainment's cloud gaming platform. The architecture pools compute resources across data centres. Researchers can specify experiment parameters, schedule experiments to run automatically when cloud resources become available, and view the collected data in a browser. On this platform the team ran hundreds of experiments, pushing the AI's technique and tactics to unprecedented levels.

Finally, it was time to train the AI driver itself. Like the AI predecessors that beat humans at other games, GT Sophy was trained with deep reinforcement learning, which avoids having to hand-code game behaviour into a large, complicated set of behavioural rules. When the agent takes an action in the training environment, the algorithm rewards or penalises it according to the outcome; after each reward (or penalty), GT Sophy updates its knowledge of the world to decide its next move. SONY AI's researchers and engineers developed new reinforcement-learning techniques for the project, including a training algorithm called QR-SAC that lets the AI reason about the consequences of its high-speed driving decisions within the rules and the physical constraints. They encoded the racing rules in a form the agent could understand and built a training regime that fosters subtle racing skills. In reinforcement learning the agent must weigh the long-term consequences of its behaviour, and it collects its own data as it learns, so there is no need for complex hand-coded rules. Of course, a domain as complex as Gran Turismo still demands correspondingly sophisticated algorithms, reward functions, and training scenarios. In the later stages of training, the researchers added a range of different opponents to sharpen GT Sophy against human drivers, and the results showed the strength of SONY's approach.

After just a few hours of training, GT Sophy was lapping the track, and within "a day or two" it was faster than 95 per cent of the drivers in its training data set. But 95 per cent was nowhere near enough. After a further 45,000 hours of training, GT Sophy could outrace human drivers on three circuits: Dragon Trail Seaside in Croatia, the Lake Maggiore GP circuit in Italy, and the Circuit de la Sarthe in France.

Against humans, of course, an AI enjoys innate advantages such as perfect memory and fast reactions. GT Sophy in particular had an exact map of the track, with the coordinates of the track boundaries, plus "precise information about the load on each tyre, the slip angle of each tyre and the state of other vehicles". Two other factors, however, could still be constrained: action frequency and reaction time. GT Sophy's actions were limited to 10 Hz, whereas the theoretical maximum for a human is 60 Hz, which sometimes allows human drivers to make smoother movements at high speed. As for reaction time, GT Sophy could respond to events on track within 23 to 30 milliseconds, far faster than the 200 to 250 milliseconds estimated as the best a professional athlete can manage. To compensate, the researchers added artificial delays, training versions of GT Sophy that responded at 100, 200, and 250 milliseconds respectively. Even so, GT Sophy achieved "superhuman lap times in all three tests".

The head of SONY's AI division admits that teaching the AI to race courteously was hard: getting it to make tactical decisions against opponents without being too aggressive or too cautious was an enormous task. First, the AI had to learn to drive: to understand the virtual car's position, the virtual aerodynamic model, the track layout, and the basic driving actions in the game environment. Then came the tactics specific to Gran Turismo, such as slipstream passing, crossover overtaking, and the various forms of blocking. Finally, the AI had to learn the necessary rules of conduct on the track, such as avoiding flagrant fouls and respecting the safety of an opponent's line.

Race-car control: the QR-SAC algorithm can explicitly reason about the range of possible outcomes of GT Sophy's high-speed actions. Accounting for the consequences of each driving decision, and the uncertainty around them, helps GT Sophy take corners at the physical limit and weigh complex possibilities when racing different kinds of opponents.

Racing tactics: training incorporated mixed scenarios, artificial race situations reproducing critical moments on each track, and specialised opponents that helped the agent learn these skills. These skill-building programmes taught GT Sophy professional techniques, including handling crowded starts, countering slipstream passes with catapult overtakes, and defensive manoeuvres.

Racing etiquette: to help GT Sophy learn racetrack etiquette, SONY AI's researchers found ways to encode both the written and unwritten rules of racing into a complex reward function. The team also had to balance the number of opponents so that GT Sophy stayed just competitive enough in training races, without becoming too aggressive or too timid against humans.
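The tyre-physics argument in the article, that friction is proportional to vertical load and that accelerating or braking shifts that load between axles, can be illustrated with a toy calculation. The point-mass model and all the numbers below are illustrative assumptions, not GT Sport's actual physics engine:

```python
# Toy model of longitudinal load transfer and the "friction circle".
# All constants are invented for illustration, not taken from GT Sport.

MU = 1.0          # tyre-road friction coefficient (assumed)
MASS = 1200.0     # vehicle mass, kg (assumed)
G = 9.81          # gravitational acceleration, m/s^2
WHEELBASE = 2.6   # m (assumed)
CG_HEIGHT = 0.5   # height of the centre of gravity, m (assumed)

def axle_loads(accel_x):
    """Vertical load (N) on the front and rear axles.

    accel_x > 0 means the car is accelerating, which shifts load to the
    rear axle; accel_x < 0 (braking) shifts it to the front.
    Assumes a 50/50 static weight distribution.
    """
    static = MASS * G / 2.0
    transfer = MASS * accel_x * CG_HEIGHT / WHEELBASE
    return static - transfer, static + transfer  # (front, rear)

def max_lateral_force(accel_x):
    """Cornering force still available at the front axle.

    The friction circle: total tyre force is capped at MU * load, and
    force spent on accelerating or braking reduces what is left over
    for cornering.
    """
    front, _ = axle_loads(accel_x)
    cap = MU * front
    longitudinal = abs(MASS * accel_x) / 2.0  # front axle's share
    remaining_sq = cap**2 - longitudinal**2
    return max(remaining_sq, 0.0) ** 0.5

# Accelerating unloads the front axle, so less cornering force remains
# there than when coasting: the understeer mechanism described above.
print(max_lateral_force(0.0))   # coasting
print(max_lateral_force(3.0))   # accelerating: smaller value, understeer risk
```

The same bookkeeping, applied to the rear axle under braking, reproduces the oversteer case the article describes.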
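DART itself is SONY's proprietary infrastructure, but the asynchronous rollout-and-training pattern it implements is generic. As a rough sketch under stated assumptions, threads and a queue stand in for the 1,000 PS4 consoles and the cloud platform; every name here is hypothetical, not SONY's API:

```python
import queue
import threading

# Minimal sketch of asynchronous experience collection: "rollout workers"
# stand in for consoles running the game, and a single trainer consumes
# their output. All names are hypothetical.

experience = queue.Queue()

def rollout_worker(worker_id, episodes):
    """Simulate one console playing episodes and shipping transitions."""
    for ep in range(episodes):
        # The real system would ship (state, action, reward, next_state)
        # tuples from GT Sport; here we ship a placeholder record.
        experience.put({"worker": worker_id, "episode": ep, "reward": 1.0})

def trainer(total_expected):
    """Consume experience records and (pretend to) update the policy."""
    consumed = 0
    while consumed < total_expected:
        experience.get()  # blocks until a worker produces a record
        consumed += 1
    return consumed

workers = [threading.Thread(target=rollout_worker, args=(i, 10)) for i in range(8)]
for w in workers:
    w.start()
total = trainer(8 * 10)
for w in workers:
    w.join()
print(total)  # 80 records consumed
```

The point of the pattern is that data collection and learning are decoupled: slow simulators never block the learner, which is what makes scaling to a thousand consoles practical.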
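The article says the written and unwritten rules of racing were encoded into a complex reward function on top of the core progress signal. A heavily simplified composite reward of that general shape might look as follows; the terms and weights are invented for illustration and are not the ones SONY used:

```python
def race_reward(progress_m, off_course, wall_contact, at_fault_collision,
                passed_opponent):
    """Toy composite reward in the spirit of GT Sophy's shaping.

    progress_m: metres of track progress this step (the core signal).
    The penalty terms encode rules and etiquette: staying on course,
    avoiding walls, and not causing collisions. All weights are
    illustrative assumptions.
    """
    reward = progress_m
    if off_course:
        reward -= 5.0
    if wall_contact:
        reward -= 10.0
    if at_fault_collision:
        reward -= 20.0   # sportsmanship: punish causing contact
    if passed_opponent:
        reward += 2.0
    return reward

print(race_reward(3.2, False, False, False, False))  # clean driving: 3.2
print(race_reward(3.2, True, False, True, False))    # off course + collision
```

Tuning such weights is exactly the balancing act the article describes: penalties too harsh make the agent timid, too lenient make it dangerously aggressive.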
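The artificial 100 to 250 millisecond reaction delays mentioned above can be mimicked with a simple delay buffer: observations are queued, and the policy only sees the world as it was a fixed number of ticks ago. This is a generic technique sketch, not SONY's implementation:

```python
from collections import deque

class DelayedObservation:
    """Show an agent the world as it was `delay_ticks` steps ago.

    At a 10 Hz action frequency (100 ms per tick), delay_ticks=2 models
    a 200 ms reaction time. Until the buffer fills, the oldest stored
    observation is returned.
    """
    def __init__(self, delay_ticks):
        self.buffer = deque(maxlen=delay_ticks + 1)

    def observe(self, obs):
        self.buffer.append(obs)
        return self.buffer[0]  # oldest stored = the delayed view

delayed = DelayedObservation(delay_ticks=2)  # 200 ms at 10 Hz
print(delayed.observe("t0"))  # t0 (buffer still filling)
print(delayed.observe("t1"))  # t0
print(delayed.observe("t2"))  # t0
print(delayed.observe("t3"))  # t1: the agent now runs 200 ms behind
```

Training against a delayed view like this forces the policy to anticipate rather than react instantly, which is how the superhuman reaction-time advantage can be handicapped away.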