in reply to Tamas G

Nah. Giving LLMs their identity is done in post-training, and Google can simply leave that data out of the post-training mix they prepare for Apple. They're definitely going to have one. Since pretraining accounts for ~60-90% of the compute, redoing the post-training won't be that expensive. There are rumors that Google will only ship the base models anyway, and that all of the fine-tuning will be done internally by Apple (perhaps on Google infrastructure). Those base models have no idea what they are yet.
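
To make "that data" concrete, here's a rough sketch of what identity examples in a supervised fine-tuning mix could look like. The record format, field names, and wording are purely illustrative, not anything from Google's actual pipeline:

```python
# Hypothetical sketch: "identity" is just more SFT data in the post-training mix.
identity_examples = [
    {"messages": [
        {"role": "user", "content": "Who are you?"},
        {"role": "assistant", "content": "I'm Gemini, a large language model built by Google."},
    ]},
    {"messages": [
        {"role": "user", "content": "Which model am I talking to?"},
        {"role": "assistant", "content": "You're chatting with Gemini, developed by Google DeepMind."},
    ]},
]

def build_sft_mix(base_mix, include_identity=True):
    # Dropping the identity examples is all it takes to hand a partner
    # a model that has never been told what it is.
    return list(base_mix) + (identity_examples if include_identity else [])

# Mix Google would train on vs. the one it could prepare for a partner:
google_mix = build_sft_mix(base_mix=[], include_identity=True)
partner_mix = build_sft_mix(base_mix=[], include_identity=False)
```

The point being: the model's sense of "who it is" lives in a few thousand examples like these, not in the expensive pretraining run, so swapping it out for a partner is cheap.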