- I’m an Artificial Intelligence grad, now a learner at Apple Developer Academy @UNINA Federico II.
- Solving real-world challenges using technology is my thing.
- Artificial Intelligence and Machine Learning - I’ve worked on Apple platform apps that tap into cutting-edge AI/ML frameworks like Create ML, Core ML, and Natural Language (see the sketch after this list).
- Spatial Computing - I have hands-on experience creating spatial apps for Apple platforms with ARKit and RealityKit.
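
A small, self-contained taste of the AI/ML work above, using the Natural Language framework to score sentiment; the sample string is arbitrary placeholder text:

```swift
import NaturalLanguage

// Score the sentiment of a sentence with NLTagger.
// The .sentimentScore scheme yields a rawValue from -1.0 (negative) to 1.0 (positive).
let text = "I love building apps for Apple platforms."
let tagger = NLTagger(tagSchemes: [.sentimentScore])
tagger.string = text

let (tag, _) = tagger.tag(at: text.startIndex, unit: .paragraph, scheme: .sentimentScore)
let score = Double(tag?.rawValue ?? "0") ?? 0
print("Sentiment score: \(score)") // e.g. a value close to 1.0 for positive text
```
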
- MLX Swift - A Swift-native framework for running lightweight Large Language Models directly on Apple devices.
- Foundation Models - An Apple framework that lets you tap into the on-device LLM that powers Apple Intelligence (see the sketch after this list).
- RealityKit - Exploring RealityKit’s more advanced capabilities.
- Metal - Writing custom shaders using Metal.
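
As a taste of the Foundation Models item above, a minimal sketch assuming a device with Apple Intelligence enabled (iOS 26 / macOS 26 or later); the `summarize` function and its instructions string are illustrative, not from a real project:

```swift
import FoundationModels

// Ask the on-device model that powers Apple Intelligence for a one-sentence summary.
// LanguageModelSession wraps the system LLM; respond(to:) runs a single prompt.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```
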
- SwiftUI - A declarative framework to build beautiful, reactive user interfaces across Apple platforms.
- SwiftData - Apple’s lightweight, Swift-native framework for local data persistence and model-driven apps.
- Swift Charts - A flexible framework to create beautiful, animated charts and data visualizations in SwiftUI (see the sketch after this list).
- MapKit - A framework for embedding interactive maps and location-based features into your app.
- HealthKit - A secure framework for managing, sharing, and analyzing health and fitness data across Apple devices.
- SceneKit - A 3D graphics framework for building immersive scenes, animations, and visualizations on Apple platforms.
- Core Haptics - A framework for creating rich, dynamic haptic feedback to enhance user interactions.
- AVFoundation - Apple’s comprehensive framework for handling media playback, audio, and speech synthesis in apps.
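
A quick sketch combining SwiftUI and Swift Charts from the list above; the `StepsChart` view and its step counts are hypothetical sample data, not pulled from HealthKit:

```swift
import SwiftUI
import Charts

// A SwiftUI view that plots a week of step counts as a bar chart.
struct StepsChart: View {
    struct DailySteps: Identifiable {
        let id = UUID()
        let day: String
        let steps: Int
    }

    // Hypothetical sample data for illustration only.
    let data: [DailySteps] = [
        .init(day: "Mon", steps: 5200),
        .init(day: "Tue", steps: 7400),
        .init(day: "Wed", steps: 6100),
        .init(day: "Thu", steps: 8900),
        .init(day: "Fri", steps: 4300),
    ]

    var body: some View {
        Chart(data) { entry in
            BarMark(
                x: .value("Day", entry.day),
                y: .value("Steps", entry.steps)
            )
        }
        .frame(height: 240)
    }
}
```
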
I’m currently building Distraction Dodge, a focus training app that simulates real-world distractions like notifications and short-form videos to help users improve their attention span. It’s built with ARKit’s eye tracking on iPadOS and adapted for spatial computing on visionOS, where it leverages indirect gestures.
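The gaze-tracking idea looks roughly like the sketch below, assuming a TrueDepth-equipped iPad; `GazeTracker` and the focus logic are illustrative, not Distraction Dodge’s actual code:

```swift
import ARKit

// Track where the user is looking via ARKit face tracking.
// ARFaceAnchor.lookAtPoint estimates the gaze target in the face anchor's space.
final class GazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face (and thus gaze) tracking needs a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // Project lookAtPoint into screen space to decide whether the user held focus
        // on the target or drifted toward a simulated distraction.
        let gaze = face.lookAtPoint
        print("Gaze (face space): \(gaze)")
    }
}
```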
If you understand the problem, that’s half the battle.