Apple Intelligence announced at WWDC

At its annual Worldwide Developers Conference (WWDC), Apple unveiled its new AI platform, Apple Intelligence, which will be integrated across iPhone, iPad, and Mac.

“At Apple, it’s always been our goal to design powerful personal products that enrich people’s lives by enabling them to do the things that matter most as simply and as easily as possible,” said Tim Cook, CEO of Apple, during the livestream. “We’ve been using artificial intelligence and machine learning for years to help us further that goal. Recent developments in generative intelligence and large language models offer powerful capabilities that provide the opportunity to take the experience of using Apple products to new heights.”

According to Craig Federighi, senior vice president of software engineering at Apple, the goal of Apple Intelligence is to combine the power of generative models with personalization based on Apple’s knowledge of a user. 

“It draws on your personal context, to give you intelligence that’s most helpful and relevant to you,” he said. 

Apple Intelligence offers multimodal generative capabilities, meaning it can generate text and images. For instance, it can use information about your contacts to create personalized images, such as generating an image of your friend in front of a birthday cake to send in a message wishing them a happy birthday. 

It can also help you improve your writing by proofreading or rewriting what you have written. When rewriting, you can tell it to make the text more friendly, professional, or concise.

These new generative capabilities are available across Apple apps like Mail, Notes, Safari, Pages, and Keynote. Third-party developers can also build capabilities into their apps by using the App Intents framework. 
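
For developers, exposing an in-app action to the system through App Intents follows a fairly compact pattern. The sketch below is only a minimal illustration of that pattern; the intent, its parameters, and its dialog are hypothetical and not something Apple demonstrated.

```swift
import AppIntents

// Hypothetical intent exposing a simple note-creation action to the system.
struct CreateNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Create Note"

    // Parameters the system can fill in from the user's request.
    @Parameter(title: "Title")
    var noteTitle: String

    @Parameter(title: "Body")
    var noteBody: String

    // Invoked when the system (for example, Siri or Shortcuts) runs the intent.
    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The app's own storage logic would save the note here.
        return .result(dialog: "Created a note titled \(noteTitle).")
    }
}
```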

Beyond its generative capabilities, the platform can also carry out specific tasks for you. Examples Federighi gave of this in action include asking it to “Pull up the files that Joz shared with me last week,” “Show me all the photos of Mom, Olivia, and me,” or “Play the podcast that my wife sent the other day.”    

During the event, Federighi repeatedly emphasized that what makes Apple Intelligence special is its ability to draw on personal context. In another example, an employee gets an email saying a meeting has been rescheduled to later in the afternoon, and he wants to know whether he can attend and still make it to his daughter's play that evening. Apple Intelligence draws on its knowledge of who his daughter is, the play details she sent a few days ago, and predicted traffic between the office and the theater at the time he'd be leaving to provide an answer.

“Understanding this kind of personal context is essential for delivering truly helpful intelligence, but it has to be done right,” he said. “You should not have to hand over all the details of your life to be warehoused and analyzed in someone’s AI cloud.”

According to Federighi, Apple Intelligence features on-device processing, “so that it’s aware of your personal data without collecting your personal data.” He explained that this is possible because of the advanced computing power of its Apple silicon processors (A17 and the M family of chips). 

Behind the scenes, it creates a semantic index about you on the device, then consults that index when a query is made.
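
Apple did not detail how that index is built, but conceptually a semantic index maps personal items to embedding vectors and returns the closest matches for a query. The sketch below is purely illustrative, with made-up item types and no real embedding model; it is not Apple's implementation.

```swift
import Foundation

// Illustrative only: a toy on-device semantic index. A real system would use
// a learned embedding model to produce the vectors.
struct IndexedItem {
    let id: String          // e.g. an email, file, or photo identifier
    let vector: [Double]    // embedding of the item's content
}

func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    let dot = zip(a, b).map(*).reduce(0, +)
    let normA = sqrt(a.map { $0 * $0 }.reduce(0, +))
    let normB = sqrt(b.map { $0 * $0 }.reduce(0, +))
    return dot / (normA * normB + 1e-9)
}

// At query time, consult the index for the items most relevant to the query.
func search(query: [Double], in index: [IndexedItem], topK: Int = 3) -> [IndexedItem] {
    let ranked = index.sorted {
        cosineSimilarity(query, $0.vector) > cosineSimilarity(query, $1.vector)
    }
    return Array(ranked.prefix(topK))
}
```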

There are some instances where the on-device processing may not be enough and data needs to be processed on a server. To allow for this without compromising privacy, Apple announced Private Cloud Compute. When a request is made, the device will determine what can be handled on device and what needs to be sent to Private Cloud Compute. According to Apple, the data sent to Private Cloud Compute is not stored or accessible to Apple; it is used only to complete the request.  
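
Apple did not publish the routing logic, but the described flow amounts to handling what it can on device and forwarding only the remainder. The sketch below is a hypothetical illustration of that decision; the types, names, and threshold are invented, not Apple's.

```swift
// Illustrative only: hypothetical routing between on-device processing and
// Private Cloud Compute.
enum ProcessingTarget {
    case onDevice
    case privateCloudCompute
}

struct AIRequest {
    let prompt: String
    let estimatedComplexity: Int   // stand-in for whatever capability check the device applies
}

func route(_ request: AIRequest, onDeviceLimit: Int = 100) -> ProcessingTarget {
    // Keep the request local when the on-device model can handle it; otherwise
    // send it to Private Cloud Compute, where (per Apple) the data is neither
    // stored nor accessible to Apple and is used only to fulfill the request.
    return request.estimatedComplexity <= onDeviceLimit ? .onDevice : .privateCloudCompute
}
```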

“This sets a brand-new standard for privacy in AI and unlocks intelligence you can trust,” he said.

Siri is being updated to take advantage of Apple Intelligence's capabilities. The assistant now has better language understanding, so it can understand you even if you're unclear or stumble over your words. It will also maintain conversational context, allowing you to follow up with additional questions or requests. Siri also now accepts typed requests, so you no longer have to speak them out loud.

Siri has also been visually updated; when active, a glowing ring will display around the edge of the screen.

And finally, ChatGPT has been integrated into Siri and Writing Tools. Users can control when ChatGPT is used and will be required to confirm that it’s okay to share information with ChatGPT. 

