Although XR headsets have become increasingly prevalent, their integration with mobile phones remains limited. Existing approaches typically replicate phone interfaces in XR or employ the phone as a 6DoF controller. We propose a framework that enables seamless transitions between mirrored, magnified, and augmented views of mobile interfaces in XR. Our system dynamically adapts these interfaces to application content, offering a generalizable approach to phone–XR integration. Guided by literature reviews and expert workshops, we established a design space and developed a prototype evaluated with real-world applications. A user study underscored the system's potential to foster a cohesive cross-device ecosystem.