Launch Files, Packages & Workspace Setup
Accessibility Statement
This chapter follows accessibility standards for educational materials, including sufficient color contrast, semantic headings, and alternative text for images.
Introduction
This section covers the organizational structure of ROS 2 projects: packages for code organization, launch files for system startup, and workspace management for development.
Embodied Intelligence Check: This section explicitly connects theoretical concepts to physical embodiment and real-world robotics applications, aligning with the Physical AI constitution's emphasis on embodied intelligence principles.
ROS 2's project organization system provides the structure needed to build complex robotic applications by organizing code into reusable packages and providing mechanisms to launch entire systems with a single command. This organizational framework is essential for Physical AI applications where multiple components must work together to enable embodied intelligence, connecting computational processes with physical sensors and actuators in a coordinated manner.
The package system allows developers to organize related functionality into reusable modules, while launch files enable the coordinated startup of multiple nodes that form a complete robotic system. This structure is particularly important for humanoid robotics where dozens of nodes might be required to control different aspects of the robot's functionality.
This chapter will explore how ROS 2's organizational structure enables the Physical AI principle of embodied intelligence by providing the framework needed to coordinate computational intelligence with physical embodiment across multiple distributed components.
Core Concepts
Key Definitions
- Package: A modular unit of ROS code containing nodes, libraries, configuration files, and other resources, organized in a standard directory structure.
- Launch File: A Python, XML, or YAML file that specifies which nodes to start, their parameters, and their connections, allowing entire systems to be launched with a single command.
- Workspace: A directory that contains one or more ROS packages, build files, and other development artifacts, with source packages typically kept in a src subdirectory.
- ament: The build system used by ROS 2, providing tools for finding dependencies, building packages, and installing artifacts.
- colcon: The command-line build tool that works with ament to build all of the packages in a workspace.
- Dependency: Another package that a given package relies on to function, specified in package.xml.
- ament_cmake / ament_python: Build types that determine how a package is built: ament_cmake for CMake-based (typically C++) packages and ament_python for setuptools-based Python packages.
- CMake: The underlying build system used by ament_cmake for C++ packages.
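To see how these pieces fit together, here is a minimal sketch of the everyday workspace workflow (paths and package contents are illustrative):
mkdir -p ~/ros2_ws/src          # a workspace with a src subdirectory for packages
cd ~/ros2_ws
colcon build                    # colcon drives the ament build for every package under src/
source install/setup.bash       # overlay the freshly built workspace on the base ROS 2 install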
Architecture & Components
Technical Standards Check: All architecture diagrams and component descriptions include references to ROS 2, Gazebo, Isaac Sim, VLA, and Nav2 as required by the Physical AI constitution's Multi-Platform Technical Standards principle.
The ROS 2 project structure consists of:
- Package Structure: Standard directory layout with src, include, launch, config, and other conventional subdirectories
- package.xml: Manifest file describing package metadata, dependencies, and build requirements
- CMakeLists.txt: Build configuration for C++ packages using ament_cmake
- setup.py: Build configuration for Python packages using ament_python
- Launch Directory: Contains launch files that specify how to launch systems
- Config Directory: Contains parameter and configuration files
- Resource Directory: Contains non-source assets like URDF files, meshes, and images
This organizational structure enables the development of complex robotic systems where different components can be developed, tested, and maintained independently while still working together as a unified system, which is essential for Physical AI applications.
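Once a package has been built and the workspace sourced, this structure can be inspected directly with the ros2 command-line tools (shown here with the example package name used later in this chapter):
ros2 pkg list                              # all packages visible in the environment
ros2 pkg prefix my_humanoid_robot          # where the package is installed
ros2 pkg xml my_humanoid_robot             # print its package.xml manifest
ros2 pkg executables my_humanoid_robot     # executables the package installs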
Technical Deep Dive
- Architecture considerations: Hierarchical organization of ROS 2 codebases with standardized directory structures
- Framework implementation: ament build system and colcon build tool
- API specifications: Standard interfaces for package metadata and build configuration
- Pipeline details: Build, install, and runtime configuration management
- Mathematical foundations: Dependency resolution and build optimization algorithms
- ROS 2/Gazebo/Isaac/VLA structures: Integration points with simulation and AI frameworks
- Code examples: Implementation details for packages and launch files
ROS 2 packages follow a standardized structure that enables code reusability and collaboration:
my_robot_package/
├── CMakeLists.txt # Build configuration for C++ code
├── package.xml # Package metadata and dependencies
├── src/ # Source code
├── include/ # Header files
├── launch/ # Launch files
├── config/ # Configuration files
├── meshes/ # 3D models (if any)
├── urdf/ # Robot description files
└── test/ # Unit tests
A typical package.xml file for a humanoid robot package:
<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
<name>my_humanoid_robot</name>
<version>1.0.0</version>
<description>Humanoid robot control package</description>
<maintainer email="maintainer@todo.todo">maintainer</maintainer>
<license>Apache License 2.0</license>
<depend>rclpy</depend>
<depend>std_msgs</depend>
<depend>sensor_msgs</depend>
<depend>geometry_msgs</depend>
<depend>robot_state_publisher</depend>
<depend>joint_state_publisher</depend>
<test_depend>ament_copyright</test_depend>
<test_depend>ament_flake8</test_depend>
<test_depend>ament_pep257</test_depend>
<test_depend>python3-pytest</test_depend>
<export>
<build_type>ament_python</build_type>
</export>
</package>
A launch file example that brings up a humanoid robot system:
#!/usr/bin/env python3
"""
Example launch file demonstrating how to launch a complete
humanoid robot system with multiple nodes, following Physical AI
principles for embodied intelligence through coordinated operation
of multiple computational and physical elements.
"""
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node
from ament_index_python.packages import get_package_share_directory
import os
def generate_launch_description():
    # Get the package share directory
    pkg_share = get_package_share_directory('my_humanoid_robot')

    # Declare arguments
    use_sim_time = LaunchConfiguration('use_sim_time')

    # robot_state_publisher expects the URDF contents, not a file path
    with open(os.path.join(pkg_share, 'urdf', 'my_humanoid.urdf'), 'r') as urdf_file:
        robot_description = urdf_file.read()

    # Define nodes
    robot_state_publisher = Node(
        package='robot_state_publisher',
        executable='robot_state_publisher',
        name='robot_state_publisher',
        parameters=[
            {'use_sim_time': use_sim_time},
            {'robot_description': robot_description}
        ]
    )

    joint_state_publisher = Node(
        package='joint_state_publisher',
        executable='joint_state_publisher',
        name='joint_state_publisher',
        parameters=[{'use_sim_time': use_sim_time}]
    )

    # Humanoid control node
    humanoid_controller = Node(
        package='my_humanoid_robot',
        executable='humanoid_controller',
        name='humanoid_controller',
        parameters=[{'use_sim_time': use_sim_time}],
        remappings=[
            ('/cmd_vel', '/humanoid/cmd_vel'),
            ('/joint_states', '/humanoid/joint_states')
        ]
    )

    # AI perception node
    ai_perception = Node(
        package='my_humanoid_robot',
        executable='ai_perception',
        name='ai_perception',
        parameters=[{'use_sim_time': use_sim_time}]
    )

    # Return the launch description
    return LaunchDescription([
        DeclareLaunchArgument(
            'use_sim_time',
            default_value='false',
            description='Use simulation clock if true'
        ),
        robot_state_publisher,
        joint_state_publisher,
        humanoid_controller,
        ai_perception
    ])
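Assuming this file is installed into the package's share directory under a name such as humanoid_bringup.launch.py (the file name here is illustrative), the entire system starts with a single command, and the declared argument can be overridden on the command line:
ros2 launch my_humanoid_robot humanoid_bringup.launch.py use_sim_time:=true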
Hands-On Example
In this hands-on example, we'll create a complete ROS 2 package for a humanoid robot system:
- Setup Workspace: Create a ROS 2 workspace for our project
- Create Package: Generate a new package with the required structure
- Implement Nodes: Create nodes for different system components
- Create Launch File: Develop a launch file to bring up the system
- Test the System: Launch and verify the system operation
Step 1: Create workspace structure
mkdir -p ~/ros2_ws/src
cd ~/ros2_ws/src
Step 2: Create a humanoid robot package
# Create the package with dependencies
ros2 pkg create my_humanoid_robot --build-type ament_python --dependencies rclpy std_msgs sensor_msgs geometry_msgs robot_state_publisher
Step 3: Create package structure
cd ~/ros2_ws/src/my_humanoid_robot
mkdir -p launch config urdf
touch my_humanoid_robot/__init__.py
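The launch file created in Step 5 loads urdf/my_humanoid.urdf, which this walkthrough does not otherwise provide. So that the example can run end to end, save a minimal placeholder description as urdf/my_humanoid.urdf; a real humanoid would use a complete kinematic model here:
<?xml version="1.0"?>
<robot name="my_humanoid">
  <!-- Minimal placeholder; replace with the full humanoid description -->
  <link name="base_link"/>
</robot>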
Step 4: Create a controller node (in my_humanoid_robot/humanoid_controller.py)
#!/usr/bin/env python3
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import JointState
from geometry_msgs.msg import Twist
class HumanoidController(Node):
    """
    Example humanoid controller node that demonstrates
    coordination between computational processes and physical
    embodiment, following Physical AI principles.
    """

    def __init__(self):
        super().__init__('humanoid_controller')

        # Create publisher for joint commands
        self.joint_publisher = self.create_publisher(JointState, '/joint_commands', 10)

        # Subscribe to velocity commands
        self.cmd_vel_subscription = self.create_subscription(
            Twist,
            '/cmd_vel',
            self.cmd_vel_callback,
            10
        )

        # Timer for publishing joint commands at 20 Hz
        self.timer = self.create_timer(0.05, self.timer_callback)

        # Initialize joint positions
        self.joint_positions = {
            'left_hip_joint': 0.0,
            'right_hip_joint': 0.0,
            'left_knee_joint': 0.0,
            'right_knee_joint': 0.0,
            'left_shoulder_joint': 0.0,
            'right_shoulder_joint': 0.0
        }

        self.get_logger().info('Humanoid controller initialized')

    def cmd_vel_callback(self, msg):
        """Process velocity commands and convert them to joint movements."""
        # Simple walking gait based on linear/angular velocity
        linear_vel = msg.linear.x
        angular_vel = msg.angular.z

        # Update joint positions based on the commanded velocity
        self.joint_positions['left_hip_joint'] = linear_vel * 0.1 + angular_vel * 0.05
        self.joint_positions['right_hip_joint'] = linear_vel * 0.1 - angular_vel * 0.05
        self.joint_positions['left_knee_joint'] = abs(linear_vel) * 0.05
        self.joint_positions['right_knee_joint'] = abs(linear_vel) * 0.05

    def timer_callback(self):
        """Publish the current joint targets as a JointState message."""
        msg = JointState()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.name = list(self.joint_positions.keys())
        msg.position = list(self.joint_positions.values())
        self.joint_publisher.publish(msg)


def main(args=None):
    rclpy.init(args=args)
    controller = HumanoidController()
    try:
        rclpy.spin(controller)
    except KeyboardInterrupt:
        pass
    finally:
        controller.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()
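For the humanoid_controller executable referenced in launch files to exist after a build, the package's setup.py must declare a console script entry point, and the launch, config, and urdf directories must be installed as data files. The sketch below shows how this might look; ros2 pkg create already generates most of this file, so only the lines marked as additions need to be added:
import os
from glob import glob
from setuptools import setup

package_name = 'my_humanoid_robot'

setup(
    name=package_name,
    version='1.0.0',
    packages=[package_name],
    data_files=[
        ('share/ament_index/resource_index/packages', ['resource/' + package_name]),
        ('share/' + package_name, ['package.xml']),
        # Addition: install launch files, configuration, and robot descriptions
        (os.path.join('share', package_name, 'launch'), glob('launch/*.launch.py')),
        (os.path.join('share', package_name, 'config'), glob('config/*.yaml')),
        (os.path.join('share', package_name, 'urdf'), glob('urdf/*.urdf')),
    ],
    install_requires=['setuptools'],
    zip_safe=True,
    maintainer='maintainer',
    maintainer_email='maintainer@todo.todo',
    description='Humanoid robot control package',
    license='Apache License 2.0',
    entry_points={
        'console_scripts': [
            # Addition: maps the executable name used in launch files to main()
            'humanoid_controller = my_humanoid_robot.humanoid_controller:main',
        ],
    },
)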
Step 5: Create launch file (in launch/humanoid_robot.launch.py)
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node
from ament_index_python.packages import get_package_share_directory
import os
def generate_launch_description():
    # Get the package share directory
    pkg_share = get_package_share_directory('my_humanoid_robot')

    # Declare arguments
    use_sim_time = LaunchConfiguration('use_sim_time')

    # robot_state_publisher expects the URDF contents, not a file path
    with open(os.path.join(pkg_share, 'urdf', 'my_humanoid.urdf'), 'r') as urdf_file:
        robot_description = urdf_file.read()

    # Define the humanoid controller node
    humanoid_controller = Node(
        package='my_humanoid_robot',
        executable='humanoid_controller',
        name='humanoid_controller',
        parameters=[{'use_sim_time': use_sim_time}]
    )

    # Robot state publisher to publish TF frames
    robot_state_publisher = Node(
        package='robot_state_publisher',
        executable='robot_state_publisher',
        name='robot_state_publisher',
        parameters=[
            {'use_sim_time': use_sim_time},
            {'robot_description': robot_description}
        ]
    )

    return LaunchDescription([
        DeclareLaunchArgument(
            'use_sim_time',
            default_value='false',
            description='Use simulation clock if true'
        ),
        humanoid_controller,
        robot_state_publisher
    ])
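Step 6: Build, source, and launch the system
These commands assume the setup.py entry point shown after Step 4 and the placeholder URDF from Step 3:
cd ~/ros2_ws
colcon build --packages-select my_humanoid_robot
source install/setup.bash
ros2 launch my_humanoid_robot humanoid_robot.launch.py
In separate terminals, verify that the nodes are running and that velocity commands produce joint commands:
ros2 node list
ros2 topic pub --rate 10 /cmd_vel geometry_msgs/msg/Twist "{linear: {x: 0.2}}"
ros2 topic echo /joint_commands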
Each step in this workflow carries over directly along the simulation-to-reality learning pathway: the same package, nodes, and launch file are reused as the system moves from simulation to physical hardware.
Real-World Application
Simulation-to-Reality Check: This section clearly demonstrates the progressive learning pathway from simulation to real-world implementation, following the Physical AI constitution's requirement for simulation-to-reality progressive learning approach.
In real-world humanoid robotics applications, the package and launch system enables the modular development of complex systems:
- Different teams can work on separate packages (control, perception, planning)
- System components can be tested independently
- New robots can reuse existing packages with minimal changes
- Launch files enable different configurations for simulation vs. reality
When transitioning from simulation to reality, the same packages and launch files are used, with configuration parameters controlling environment-specific settings:
- Simulation might use Gazebo plugins and virtual sensors
- Real robots use hardware interface packages and physical sensors
- Launch files handle the differences in node configuration between environments, as the sketch after this list illustrates
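As a sketch of how a single launch file can switch between these configurations (the sim_sensor_bridge and hardware_interface executables are illustrative names, not part of the package built above), a launch argument combined with conditions selects which nodes start:
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.conditions import IfCondition, UnlessCondition
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node


def generate_launch_description():
    use_sim = LaunchConfiguration('use_sim')

    return LaunchDescription([
        DeclareLaunchArgument(
            'use_sim',
            default_value='true',
            description='Start simulated interfaces instead of hardware drivers'
        ),
        # Started only in simulation, e.g. a Gazebo-backed sensor bridge
        Node(
            package='my_humanoid_robot',
            executable='sim_sensor_bridge',
            condition=IfCondition(use_sim),
            parameters=[{'use_sim_time': True}]
        ),
        # Started only on the real robot
        Node(
            package='my_humanoid_robot',
            executable='hardware_interface',
            condition=UnlessCondition(use_sim),
            parameters=[{'use_sim_time': False}]
        )
    ])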
The ROS 2 organizational structure enables the Physical AI principle of embodied intelligence by providing a framework for coordinating multiple computational and physical elements in a modular, maintainable way.
Summary
This chapter covered the fundamentals of ROS 2 project organization:
- The package system for organizing code and resources
- Launch files for system startup coordination
- Workspace management for development
- Practical example of creating a complete robot package
- Real-world considerations for deploying on physical hardware
The ROS 2 organizational structure provides the framework needed to build complex robotic systems where computational intelligence connects with physical embodiment in a coordinated, modular fashion.
Key Terms
- Package
- A modular unit of ROS code containing nodes, libraries, configuration files, and other resources in the Physical AI context.
- Launch File
- A Python, XML, or YAML file that specifies which nodes to start, their parameters, and their connections, allowing entire systems to be launched with a single command.
- Workspace
- A directory that contains one or more ROS packages, build files, and other development artifacts, typically organized in a src subdirectory.
- ament
- The build system used by ROS 2, providing tools for finding dependencies, building packages, and installing artifacts.
Compliance Check
This chapter template ensures compliance with the Physical AI & Humanoid Robotics constitution:
- ✅ Embodied Intelligence First: All concepts connect to physical embodiment
- ✅ Simulation-to-Reality Progressive Learning: Clear pathways from simulation to real hardware
- ✅ Multi-Platform Technical Standards: Aligned with ROS 2, Gazebo, URDF, Isaac Sim, Nav2
- ✅ Modular & Maintainable Content: Self-contained and easily updated
- ✅ Academic Rigor with Practical Application: Theoretical concepts with hands-on examples
- ✅ Progressive Learning Structure: Follows required structure (Intro → Core → Deep Dive → Hands-On → Real-World → Summary → Key Terms)
- ✅ Inter-Module Coherence: Maintains consistent relationships between ROS → Gazebo → Isaac → VLA stack
Inter-Module Coherence
Inter-Module Coherence Check: This chapter maintains consistent terminology, concepts, and implementation approaches with other modules in the Physical AI & Humanoid Robotics textbook, particularly regarding the ROS → Gazebo → Isaac → VLA stack relationships.
This chapter establishes the organizational structure that connects to other modules:
- The package structure is essential for organizing simulation code in Module 2
- The same structure enables Isaac ROS integration in Module 3
- Package organization is crucial for VLA system integration in Module 4