Researchers at NYU and Pompeu Fabra University unveiled “Meta-learning for Compositionality” (MLC), a training technique that improves neural networks’ ability to generalize compositionally. Brenden Lake states that “a generic neural network can mimic or exceed human systematic generalization.” Published in Nature, MLC outperformed models such as ChatGPT on tests of compositional generalization. Marco Baroni suggests the approach could strengthen the compositional capabilities of large language models.
Source: Science Daily