
This builds BOOT.BIN from an XSA file and fetched sources.  It
requires cross-compilers to be installed, among other tools.
Unfortunately, not all dependencies are documented.  Those that aren't
included are installable on Ubuntu with "apt-get".

Input XSA files are put into inputs/XSA.  These are generated in
Vivado, using "export hardware" from the file menu.

The build process here does not include any PL devices in the Linux
device tree.  This is so that Linux doesn't try to access them, which
typically causes the ARM to hang if the PL isn't configured.  The
normal operating mode of STYNQ is to boot to Linux first, *then* later
load the PL from Linux.

The main input is "inputs/XSA/{BOARD}.xsa".  If multiple XSAs are
present, boot files will be built for each XSA.  Some steps have
additional inputs, which are in the "inputs" directory for that step.
No board-specific inputs are required in these directories to boot,
although sometimes board-specific files can be included to give
additional cosmetic boot enhancements.
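
As an illustration of the naming, here is a tiny shell sketch (the
helper name is hypothetical, not part of build.sh) showing that
{BOARD} is just the XSA filename with its extension removed:

```shell
# Hypothetical helper: derive {BOARD} from an XSA path.
board_from_xsa() {
    basename "$1" .xsa
}

board_from_xsa inputs/XSA/ZCU111.xsa    # prints: ZCU111
```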

Note that BOARD is often of the form ZCU111 or RFSoC4x2, but sometimes
with the same physical board there are multiple PS configurations in
different Vivado designs (especially the PS-to-PL interface), so a
single physical board may have multiple {BOARD}.xsa files to reflect
that.  The XSA used should be from the Vivado build of your design,
and the PS processor configuration in that design should match your
board's schematic.  This ensures that the right PS devices are enabled
and they have the correct configuration.

If you have multiple Vivado designs with the same PS configuration,
you can use the same boot files for all designs.  With STYNQ, PL
differences do not go into the Linux kernel device tree, so only PS
differences require new boot files.

Drivers for PL devices are discussed separately with many examples.
PL device drivers with STYNQ do not go into the Linux kernel device
tree, but are communicated directly from Vivado's XSA file to
software.  Since this README is in the boot directory, it's the wrong
place to discuss this further.

The build process builds BOOT.BIN, image.ub, and BOOT.SCR.  Building
these files requires many components and many steps.  To the extent
possible, each step is kept separate in STYNQ, and the code to do that
step is kept as simple and transparent as possible.  This
distinguishes STYNQ from other similar projects, such as PYNQ,
PetaLinux, and YOCTO.  To be fair, YOCTO solves a much larger problem
than just booting Xilinx boards, which is good.  However, the simpler
approach taken by STYNQ gives more transparency and simpler control.
These are perhaps the principal tradeoffs.

Each component is built in a separate directory.  Output items that
depend on the XSA have {BOARD} in them.  These are listed in the order
in which they are built.

Build this first with its own build script:

  ../kernel (build.sh)             ../kernel/out/Image.gz

These are all invoked one at a time from the build.sh script in this
directory:
                                   OUTPUTS
  FSBL            (build.sh)  FSBL/out/{BOARD}_FSBL.elf
  U-BOOT       (exebuild.sh)  In U-BOOT/exe/: dtc, dumpimage, mkimage
  device-tree-gen (build.sh)  In device-tree-gen/out/:  {BOARD}.dts, {BOARD}.dtb
  U-BOOT          (build.sh)  U-BOOT/out/{BOARD}_U_BOOT.elf
  PMUFW           (build.sh)  PMUFW/out/{BOARD}_PMUFW.elf
  BL31            (build.sh)  BL31/out/bl31.elf
  bootgen         (build.sh)  bootgen/bootgen/bootgen
  BOOT_BIN        (build.sh)  BOOT_BIN/out/{BOARD}_BOOT.BIN
  BOOT_SCR        (build.sh)  BOOT_SCR/out/BOOT.SCR
  IMAGE_UB        (build.sh)  {BOARD}_image.ub
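
The per-step driver loop can be pictured roughly as follows (a
simplified sketch, not the actual build.sh; step names are from the
table above, and each step has its own build.sh):

```shell
# Illustrative sketch of the per-step driver: run each step's own
# build.sh in order, stopping at the first failure.
run_steps() {
    for step in "$@"; do
        echo "=== $step ==="
        ( cd "$step" && ./build.sh ) || return 1
    done
}

# Usage (step names from the table above):
# run_steps FSBL device-tree-gen U-BOOT PMUFW BL31 bootgen BOOT_BIN BOOT_SCR IMAGE_UB
```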

Finally, the three output files are put into out/{BOARD}, as BOOT.BIN,
BOOT.SCR, and image.ub.  These are copied onto the first
vfat-formatted partition of a microSD card to boot.  The second
partition is formatted to EXT4, containing a Linux filesystem that
will be booted into.  This Linux filesystem is created by the
tools in the ../filesystem directory.
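
Copying the boot files can be sketched like this (assuming the card's
first partition is already formatted vfat and mounted; the helper name
and the mount point argument are assumptions, not part of the build
scripts):

```shell
# Hypothetical helper: copy one board's three boot files onto the
# mounted vfat boot partition of the microSD card.
copy_boot_files() {   # usage: copy_boot_files BOARD MOUNTPOINT
    local board=$1 mnt=$2
    cp "out/$board/BOOT.BIN" "out/$board/BOOT.SCR" "out/$board/image.ub" "$mnt/"
}
```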

The top-level build.sh runs all required steps to create these boot
files for every input XSA.

When you first run build.sh to build your boot files with a new XSA,
you will need the Xilinx tool xsct, part of Vitis.  Most importantly,
this configures source code that becomes part of the PMU Firmware, and
performs initialization of PS devices.  (For example, the PS-GTR
transceivers that are needed for DisplayPort, USB, and PCIe.)

At the time of the initial release of STYNQ, Vitis 2024.2 was used.

However, after the first run the outputs that require Vitis are
cached.  So subsequent builds of the boot files don't require Vitis.
This streamlines the build when kernel modifications are being tested.
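
The caching idea can be pictured roughly like this (the function is
illustrative and not the real script, though the output path matches
the FSBL entry in the table above):

```shell
# Illustrative only: skip the xsct/Vitis step when the cached FSBL
# output already exists in its "out" directory.
build_fsbl() {   # usage: build_fsbl BOARD
    local out="FSBL/out/${1}_FSBL.elf"
    if [ -e "$out" ]; then
        echo "cached: $out (xsct not needed)"
    else
        echo "missing: $out (would run xsct from Vitis)"
    fi
}
```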

Currently the build is set up for cross-compilation on an x86 machine,
tested with Ubuntu 22.04.1.

The top-level veryclean.sh deletes all intermediate files (useful
before archiving), except those cached from Vitis runs.  To force the
cached Vitis outputs to be rebuilt, delete them from their "out"
directories with fullclean.sh.

Most steps require projects from GitHub to build.  These projects are
saved locally in gzipped tar files, so that the exact versions used
for testing can be used again.  If you delete any of these tgz files
and run the build.sh script, they will be fetched again from GitHub,
possibly as newer versions.  Check the build.sh scripts for details.
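
The fetch-or-reuse pattern looks roughly like this (the function name
and URL are placeholders; see the per-step build.sh scripts for the
real sources and tgz names):

```shell
# Illustrative fetch-or-reuse pattern for a pinned source tarball.
fetch_source() {   # usage: fetch_source TGZ GIT_URL
    local tgz=$1 url=$2
    if [ ! -e "$tgz" ]; then
        # Cache miss: fetch from GitHub and pin the result as a tgz.
        git clone --depth 1 "$url" src-tmp &&
            tar czf "$tgz" src-tmp && rm -rf src-tmp
    fi
    tar xzf "$tgz"    # unpack the pinned version for this build
}
```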

Good Luck!
I hope you like STYNQ.

Ross Martin
Bit by Bit Signal Processing, LLC


