Rob Chatfield – Senior iOS Engineer
Jira has been on the web for years, and recently we have put a lot of effort into mobile, like our iOS app, which offers quick and easy access to your projects on the go.
Many of you have asked us for a similar experience on the Mac. Well, today, I am thrilled to announce that we have done just that.
Jira is a huge project, so we were impressed that we were able to get it up and running on the Mac in just a few days.
Now, our iOS devs are Mac devs too. One code base, one team.
Here is a preview.
The boards view looks amazing on the Mac display, with plenty of room for multitasking, and the benefit of this being a native app is that it’s insanely fast. We can switch between these views instantly, which is a level of performance you just don’t get on the Web.
And this is one project, so all of the work we did on iPad automatically translated to the Mac, like keyboard shortcuts, Spotlight, and Drag and Drop.
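The keyboard-shortcut carryover Rob mentions comes from standard UIKit API: a UIKeyCommand declared for the iPad app is picked up automatically on the Mac, where it also appears in the menu bar. A minimal sketch, with a hypothetical `newIssue` action (the Jira app’s own code is not public), assuming an iOS 13 / Mac Catalyst target:

```swift
import UIKit

class BoardViewController: UIViewController {
    // Key commands declared for iPad work unchanged on the Mac.
    override var keyCommands: [UIKeyCommand]? {
        [UIKeyCommand(title: "New Issue",            // hypothetical action for this sketch
                      action: #selector(newIssue),
                      input: "n",
                      modifierFlags: .command)]
    }

    @objc func newIssue() {
        // Present the issue-creation UI here.
    }
}
```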
And it was easy to add those finishing touches that make it perfect for the desktop. For example, we added our top tasks into the custom toolbar. All of this never would have happened without Project Catalyst. We are super excited to be bringing the Jira app to Mac and can’t wait for you to try it.
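The custom toolbar Rob describes maps to NSToolbar, which Mac Catalyst exposes through the window scene’s titlebar. A minimal sketch, with a hypothetical `createIssue` action dispatched up the responder chain (Atlassian’s actual implementation is not public):

```swift
import UIKit

// Hypothetical action protocol for this sketch.
@objc protocol IssueActions {
    func createIssue(_ sender: Any?)
}

#if targetEnvironment(macCatalyst)
class SceneDelegate: UIResponder, UIWindowSceneDelegate, NSToolbarDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        // Attach a native Mac toolbar to the window scene.
        let toolbar = NSToolbar(identifier: "main")
        toolbar.delegate = self
        windowScene.titlebar?.toolbar = toolbar
    }

    func toolbarDefaultItemIdentifiers(_ toolbar: NSToolbar) -> [NSToolbarItem.Identifier] {
        [NSToolbarItem.Identifier("createIssue")]
    }

    func toolbar(_ toolbar: NSToolbar,
                 itemForItemIdentifier itemIdentifier: NSToolbarItem.Identifier,
                 willBeInsertedIntoToolbar flag: Bool) -> NSToolbarItem? {
        let item = NSToolbarItem(itemIdentifier: itemIdentifier)
        item.label = "Create Issue"
        // nil target: the action travels the responder chain.
        item.action = #selector(IssueActions.createIssue(_:))
        return item
    }
}
#endif
```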
So, look for Jira Cloud in the Mac App Store later this year.
Thank you very much.
Craig Federighi – SVP, Software Engineering, Apple Inc.
So, we are really excited about this technology and excited to see what apps you all bring to the Mac App Store this fall.
So that’s macOS Catalina.
Sidecar, strong security improvements, great new system apps, and Project Catalyst, a whole new way to bring apps to the Mac.
You know, you have seen a lot today in terms of tools for pro users, and you know who our biggest group of pros is? Well, it’s developers, all of you.
And, you know, everything we do for developers is focused on the goal of helping you create the greatest and most innovative apps, and we do this by providing tools and technologies designed to give you a head start.
Well, this year we have a whole bunch to talk about, like Metal and Core ML, but we are going to cover that later in the conference.
Today I want to focus on just two areas: AR and Swift.
And we will start with AR.
Today, we have three really big announcements, starting with RealityKit. So, you know, creating complex 3D environments can require deep knowledge of 3D modeling and mastery of sophisticated gaming engines like Unity and Unreal. But what about the developers who want to incorporate 3D and AR in their apps but don’t have that experience?
Well, that’s where RealityKit comes in. It’s built from the ground up for AR, with photorealistic rendering and amazing effects like camera motion blur and noise, and it’s seamlessly integrated with ARKit and has a native Swift API.
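That Swift API is compact. A minimal sketch of what it looks like, anchoring a simple box to a detected horizontal plane (RealityKit and ARKit on iOS 13 assumed):

```swift
import RealityKit
import ARKit

// An ARView renders the camera feed and runs an AR session automatically.
let arView = ARView(frame: .zero)

// Anchor a small blue box to the first horizontal plane ARKit finds.
let anchor = AnchorEntity(plane: .horizontal)
let box = ModelEntity(mesh: .generateBox(size: 0.1),
                      materials: [SimpleMaterial(color: .blue, isMetallic: false)])
anchor.addChild(box)
arView.scene.addAnchor(anchor)
```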
Now, how do you actually model your 3D content?
Well, that’s where Reality Composer comes in. Reality Composer is a new app featuring a drag-and-drop interface and a library of high-quality objects and animations, making it incredibly fast and easy to build an interactive environment.
Now, of course, it’s integrated with Xcode, but it’s also available on iOS. So, you can edit, test, and tune your app right on the device where it will ultimately be delivered.
Now, of course, the foundation of AR on iOS is ARKit, and ARKit 3 is a major update. I want to focus on one area: the improvements in the way people are handled in AR scenes, starting with people occlusion. Now, this is insane. What used to require painstaking compositing by hand can now be done in real time.
Now, by knowing where these people are in the scene, we can layer virtual content in front of and behind them. And check this out: Motion Capture. Just point your camera at a person and we can track, in real time, the positions of their head, their torso, and their limbs, and feed them as input into the AR experience.
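In ARKit 3 terms, the two features Craig demos map to a frame-semantics option and a dedicated body-tracking configuration. A hedged sketch (the names are ARKit 3 API; the session wiring and delegate handling are elided):

```swift
import ARKit

// People occlusion: ask ARKit to segment people, with depth, so virtual
// content can render behind them. Check availability first, since this
// requires an A12 chip or later.
let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    config.frameSemantics.insert(.personSegmentationWithDepth)
}

// Motion Capture: a separate configuration tracks a person's skeleton and
// delivers ARBodyAnchor updates through the session delegate.
if ARBodyTrackingConfiguration.isSupported {
    let bodyConfig = ARBodyTrackingConfiguration()
    // arView.session.run(bodyConfig)  // run on your ARSession
}
```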
Now, developers are going to do amazing things with ARKit 3, and that brings us to Minecraft. Minecraft is the bestselling game of all time, and a great platform for creativity.
So, to see how ARKit is enabling Minecraft to go to the next level, I’m happy to welcome to the stage, Lydia Winters and Saxs Persson from Mojang.
Lydia Winters: Thanks, Craig.
Our amazing community inspires us to push boundaries and build unique experiences. We are here today to show you our newest creation.
Saxs Persson: We have been dreaming about a Minecraft in the real world and Minecraft in Augmented Reality. We share a vision with Apple, how AR can uniquely connect us to the world and today technology is ready for that vision. We built something we think is pretty special.
Lydia Winters: We are excited to demo gameplay for the very first time right here on stage. This is Minecraft Earth.
Saxs Persson: Let’s start with the basics. This is a real, living, breathing Minecraft world right on your tabletop, with redstone circuitry, fireworks, flowing water. You can break anything, you can play with your mobs, and when I look at Lydia, I can see what tool she is holding, I can see her name, and we can see anything she is building.
Lydia Winters: I’m going to put myself into the build. There I am.
Saxs Persson: Nice. Looks just like you. Let’s use Motion Capture and make your character wave. Try one wave. Now let’s try the tricky double wave. Cool.
Lydia Winters: This looks amazing on the tabletop, but let’s use the stage. Here we go. Look, Mom, I’m in Minecraft.
Saxs Persson: With the new people occlusion feature, you fit right into your Minecraft world. That only works on iOS.