
Mengzi Model

Mengzi is a family of large-scale pre-trained language models developed by the Langboat team. It can process multilingual and multimodal data, supports a variety of understanding and generation tasks, and can be quickly adapted to different domains and application scenarios.

Product Advantages


Support Multiple Model Architectures

  • Autoregressive models, such as GPT
  • Autoencoding models, such as BERT
  • Encoder-decoder models, such as T5

Lightweight Model Performance Enhancement

  • Fusion of multiple pre-training tasks
  • SMART adversarial training
  • Knowledge distillation

Knowledge Graph Based Enhancement

  • Enhancements with entity extraction
  • Knowledge graph enhancement (is-a relationships)
  • Knowledge graph to text conversion

Linguistic Knowledge Based Enhancement

  • Mask mechanism enhanced by syntactic information
  • Semantic role embedding enhancement
  • Attention-weight pruning constrained by dependency relations

Few-Shot/Zero-Shot Learning

  • Prompt template construction
  • Multi-task learning technique
  • Out-of-the-box support for common information extraction scenarios
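As an illustration of the prompt-template idea listed above, a zero-shot task can be rephrased as a fill-in question around the raw input. This is a hypothetical sketch in Python; the template wording and labels are made up for illustration and are not the actual Mengzi prompt format.

```python
# Hypothetical sketch of prompt template construction for zero-shot
# classification. The template text and label set are illustrative,
# not the real Mengzi-T5-base-MT prompt format.
def build_prompt(text: str, labels: list[str]) -> str:
    """Wrap raw input text in a cloze-style classification template."""
    options = "/".join(labels)
    return f"Text: {text}\nQuestion: Is the sentiment {options}?\nAnswer:"

prompt = build_prompt("The service was fast and friendly.",
                      ["positive", "negative"])
print(prompt)
```

The same mechanism generalizes to other tasks (entity extraction, summarization) by swapping the question part of the template, which is what lets one multi-task model serve many scenarios without retraining.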

Retrieval Based Enhancement

  • Knowledge decoupling
  • Strong interpretability
  • External knowledge components are updated in real time

Product List

Mengzi Generative Pre-trained Transformer Model (Mengzi-GPT)

Mengzi Generative Pre-trained Transformer Model (Mengzi-GPT) is a controllable large-scale language model designed for generation scenarios. It can assist users in completing various tasks in specific scenarios through multi-turn interactions.

Learn More

Mengzi-BERT-base

It is suitable for natural language understanding tasks such as text classification, entity recognition, relationship extraction, and reading comprehension.

Learn More

Mengzi-BERT-base-fin

Trained on a financial corpus, it performs better on tasks in financial scenarios.

Learn More

Mengzi-T5-base

It is suitable for controllable text generation tasks such as copywriting generation and news generation.

Learn More

Mengzi-T5-base-MT

A multi-task model built on Mengzi-T5-base; it can complete a variety of tasks through prompts, without additional training.

Learn More

Mengzi-GPT-neo-base

Trained from scratch on a Chinese corpus, it is suitable for tasks such as text continuation and novel generation.

Learn More

Guohua Diffusion

A text-to-image generation model in the traditional Chinese painting style, capable of generating movie posters, album covers, landscape paintings, and more.

Learn More

Sentence Embedding

Based on Langboat's self-developed Mengzi lightweight technology, it outputs sentence-level vector representations of text that capture semantic similarity between texts. It is suitable for clustering, regression, anomaly detection, visualization, and other tasks.
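To make the "semantic similarity" claim concrete, here is a minimal sketch of how such sentence vectors are typically compared, using cosine similarity. The 4-dimensional vectors below are made-up stand-ins; a real sentence-embedding model outputs much higher-dimensional vectors.

```python
import numpy as np

# Minimal sketch: comparing sentence embeddings with cosine similarity.
# These short vectors are illustrative placeholders, not real model output.
def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

emb_cat    = np.array([0.9, 0.1, 0.3, 0.0])  # "a cat sat on the mat"
emb_kitten = np.array([0.8, 0.2, 0.4, 0.1])  # "a kitten rested on the rug"
emb_car    = np.array([0.0, 0.9, 0.0, 0.8])  # "the car needs new tires"

# Semantically close sentences should score higher than unrelated ones.
assert cosine_similarity(emb_cat, emb_kitten) > cosine_similarity(emb_cat, emb_car)
```

Downstream tasks such as clustering or anomaly detection then operate on these similarity scores or on the vectors directly.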

Learn More

Large-scale Model Customization

Based on the Mengzi pre-trained model technology developed by Langboat, it provides enterprise customers with faster, more effective, and lower-cost customization, optimization, and deployment services for pre-trained models.

Learn More

Get the Power of Mengzi Pre-training!

Products

Business Cooperation Email

bd@langboat.com


Address

11F, Block A, Dinghao DH3 Building, No.3 Haidian Street, Haidian District, Beijing, China


© 2023, Langboat Co., Limited. All rights reserved.
