SenseTime Files for IPO; Meet Efficient Transformer FastFormer; What's Next for China AI
China’s AI news in the week of August 29, 2021
SenseTime Set to Raise $2Bn in the Biggest IPO by an AI Company
What’s new: Leading Chinese AI company SenseTime has filed for a Hong Kong IPO to raise $2 billion. 60% of the proceeds will fund R&D, including computing resources, AI chips, models, and products, while the rest will go to business development and investment. Incidentally, the company’s Chinese name, Shang Tang, comes from Tang of Shang, the first king of the Shang dynasty in Chinese history, who overthrew Jie, the last ruler of the Xia dynasty.
Revenue: SenseTime reported revenue of RMB1,651.8 million in the first half of 2021, a year-over-year increase of 91.8%. Its 2020 revenue was RMB3,446.2 million, significantly lower than the RMB9 billion projected by Bloomberg, while its adjusted net loss in 2020 was RMB878.4 million.
Business Models: The global AI software market is expected to reach USD121.8 billion by 2025, growing at a CAGR of 31.9% from 2020, according to the prospectus. As a leading AI and computer vision service provider, SenseTime offers products and services in four sectors:
Smart business: SenseFoundry-Enterprise is a one-stop software platform to help applications handle real-world data perception and process automation across industries.
Smart City: SenseFoundry monitors the conditions of public facilities, detects incidents, and tracks the impact of natural disasters in 119 cities in China and overseas.
Smart Life: SenseME and SenseMARS create the interface connecting the physical and digital worlds by empowering more than 200 types of mobile phones, AR and VR glasses, smart screens, and consumer drones. Its AI software platform for smart healthcare, SenseCare, provides AI tools in diagnosis, treatment planning, and rehabilitation.
Smart Auto: The Smart Auto platform provides companies and automakers with ADAS and other AI capabilities.
Staff: Founded in 2014 by Professor Tang Xiao’ou, a leading AI scientist who revolutionized facial recognition with deep learning, and his students, SenseTime now counts 40 professors among its 5,000 employees, two-thirds of whom are scientists and researchers.
Shares: Dr. Tang holds 21.73% of the total shares, while other co-founders hold 12.17%. The company has raised $5.2 billion in funding led by SoftBank, XXXX. SoftBank is the largest outside shareholder of SenseTime with 14.88% of the total shares, followed by Alibaba’s Taobao and Chunhua Capital.
Fastformer: The Most Efficient Transformer Architecture
Deja Vu: A team of Tsinghua University researchers recently proposed Fastformer, an efficient Transformer model based on additive attention for long text modeling. Experiments on five datasets show that Fastformer is much more efficient than many existing Transformer models while achieving comparable or even better long text modeling performance.
Additive Attention: Long text modeling poses a major efficiency challenge for Transformer-based models.
In the basic Transformer architecture, each token (a word, for example) in an input sequence is projected into three vectors - query, key, and value. Every token’s query interacts with every other token’s key to compute a weighted sum of the values, which lets the model capture each token’s context across the whole sequence. Because every pair of tokens interacts, the computational complexity is quadratic in the sequence length.
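This pairwise interaction can be seen in a minimal NumPy sketch of standard scaled dot-product self-attention (illustrative only; the function and weight names are mine, not from the paper):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Standard scaled dot-product self-attention.
    X: (n, d) token embeddings; Wq, Wk, Wv: (d, d) projection matrices."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Every query interacts with every key: `scores` is an (n, n) matrix,
    # so time and memory grow quadratically with sequence length n.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores, axis=-1) @ V

rng = np.random.default_rng(0)
n, d = 8, 16
X = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (8, 16)
```

The (n, n) score matrix is exactly what makes long inputs expensive: doubling the sequence length quadruples the attention cost.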
In the paper Fastformer: Additive Attention Can Be All You Need, the authors propose that, instead of modeling pairwise interactions among every token’s query, key, and value, the model should (as shown in the image above):
First, summarize the query sequence into a global query vector
Then, model the interaction between the global query vector and the attention keys to form a global key vector
Model the interactions between the global key and the attention values, and use a linear transformation to learn global context-aware attention values
Finally, add these to the attention queries to form the final output.
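The four steps above can be sketched in NumPy for a single head (a simplified illustration under my own naming assumptions - `wq`, `wk`, and `Wr` are hypothetical parameter names, and details such as multi-head splitting and weight sharing from the paper are omitted):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fastformer_attention(Q, K, V, wq, wk, Wr):
    """Additive-attention sketch (single head).
    Q, K, V: (n, d) matrices; wq, wk: (d,) scoring vectors; Wr: (d, d)."""
    d = Q.shape[-1]
    # 1) Summarize all queries into one global query vector - O(n), not O(n^2).
    alpha = softmax(Q @ wq / np.sqrt(d))   # (n,) additive attention weights
    q_global = alpha @ Q                   # (d,) global query vector
    # 2) Mix the global query into each key element-wise, then summarize
    #    the result into a global key vector with another additive attention.
    P = K * q_global                       # (n, d)
    beta = softmax(P @ wk / np.sqrt(d))
    k_global = beta @ P                    # (d,) global key vector
    # 3) Mix the global key into the values and apply a linear transformation
    #    to get global context-aware values.
    U = (V * k_global) @ Wr                # (n, d)
    # 4) Add the attention queries back to form the final output.
    return U + Q

rng = np.random.default_rng(0)
n, d = 8, 16
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
wq, wk = rng.standard_normal(d), rng.standard_normal(d)
Wr = rng.standard_normal((d, d))
out = fastformer_attention(Q, K, V, wq, wk, Wr)
print(out.shape)  # (8, 16)
```

Note that every step touches each token only once, so the cost is linear in sequence length - this is the source of Fastformer’s efficiency advantage.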
The authors conducted experiments on five benchmarks - IMDB, MIND, CNN/DailyMail, PubMed, and Amazon - and demonstrated that Fastformer is much more efficient than many existing Transformer variants, including BigBird and Linformer, while achieving competitive or even better performance in long text modeling.
Alibaba Head of Machine Intelligence Technology: What’s Next for China AI?
Op-ed: Dr. Jin Rong, Head of Machine Intelligence Technology at DAMO Academy, Alibaba’s research facility, recently published an op-ed laying out his forward-looking views on contemporary AI in China. In short, he believes:
The era of AI is still nascent: AI today is roughly where electromagnetism stood when Faraday discovered electromagnetic induction - it has not yet evolved from technology into science.
Despite the unprecedented progress deep learning-based AI has achieved in recent years, its success is to some degree a matter of luck, as the underlying principles of AI are still not understood.
A fundamental understanding of deep learning, self-supervised learning, and few-shot learning represent the three key breakthrough directions for deep learning going forward.
AI for science will be the biggest opportunity for AI.
Investment News:
Xiaodu Technology, a Baidu subsidiary developing virtual assistant AI and smart hardware, has closed Series B financing at a $5.1 billion post-money valuation. Baidu remains a supermajority shareholder after the funding.
Shortly after reporting a dazzling quarterly financial result, Xiaomi announced it will acquire Deepmotion, a Beijing-based autonomous driving company offering intelligent ADAS, for RMB500 million yuan ($77.3 million). The latest acquisition adds an important piece to Xiaomi’s EV puzzle. The Chinese smart hardware giant, now the world’s second-largest smartphone maker, has also invested in self-driving tech firm Zongmu Technology, LiDAR maker Hesai Technology, battery tech company Svolt, and smart parking company AIpark in recent months to catch up with competitors.
Pimchip, a Beijing-based startup specializing in AI chips that combine memory and computing, announced it has raised nearly $10 million in its Pre-A funding round led by Puhua Capital and Sequoia China. Founded in 2021, the company is said to be developing multiple SRAM-based processing-in-memory (PIM, hence the company’s name) chips while exploring neuromorphic computing technologies.