GPT-4o Image Generation - Images (27 results)
Disaster Girl meme (4o Image Generation version)



GPT-4o Image Generation
This is fine (4o Image Generation version)



GPT-4o Image Generation
OpenAI’s new 4o image generation is insane.



GPT-4o Image Generation
New OpenAI image generation has no celebrity filter!!



GPT-4o Image Generation
GPT-4o with image generation is actually insane. These are not real images!
![A wide image taken with a phone of a glass whiteboard, in a room overlooking the Bay Bridge. The field of view shows a woman writing, sporting a t-shirt with a large OpenAI logo. The handwriting looks natural and a bit messy, and we see the photographer's reflection. The whiteboard reads: "Transfer between Modalities: Suppose we directly model p(text, pixels, sound) with one big autoregressive transformer. Pros: image generation augmented with world knowledge, next-level text rendering, native in-context learning, unified post-training. Cons: varying bit-rate, compute not adaptive. Fixes: model compressed representations; compose autoregressive prior with a powerful decoder." A diagram maps tokens through [transformer] and [diffusion] to pixels. Best of 8. Followed by: selfie view of the photographer, as she turns around to high five him.](https://i.kym-cdn.com/photos/images/original/003/035/999/f1c.jpg)
GPT-4o Image Generation