A Meta-analytical Overview of Past AIAA Design, Build, Fly Competitions and Effective Strategies in Engineering Management, Design, Manufacture, and Testing

Dragon Fan, Undergraduate, School of Mechanical and Manufacturing Engineering, University of New South Wales, AeroEng

Abstract

The upcoming DBF25 competition will be instrumental in establishing UNSW Skylabs DBF as a premier and highly competitive team of bright engineers in aeronautics. This paper aims to provide a (moderately) comprehensive overview of past AIAA Design, Build, Fly competitions, and of how lessons can be drawn from past winning teams to maximise our chances of success.

Introduction

There are a handful of areas in which UNSW Skylabs DBF can improve to best position ourselves for success in the upcoming DBF25 competition.

Our existing management structure, while sufficient, may benefit from more detail and specificity. In addition, UNSW Skylabs DBF does not currently make use of any formal planning tools, which is a high-yield area for improvement. To better streamline our internal engineering processes, it may also be a good idea to develop a detailed engineering workflow.

The mission design process is, to an extent, formulaic, and hence UNSW Skylabs DBF can draw from past strategies that have proven successful. This includes comprehensive constraint and sensitivity analysis, as well as the use of scripting tools for mission simulation.

Towards the aeronautics/aerodynamics side of things, UNSW Skylabs DBF should consider a more concrete process for the selection of aircraft designs, and a more detailed approach to preliminary analysis and design refinement. Furthermore, for the aerodynamics team, improvements to tooling and software may be beneficial.

The implementation of performance targets could greatly enhance the design process, particularly in conjunction with comprehensive and detailed testing. UNSW Skylabs DBF should also create and maintain some form of standard operating procedures, which could markedly improve the efficiency of the team.

Management

Team Organisation

As it stands, our team organisation is as follows.



Existing Organisational Structure

Academic Advisor (Dr. Sonya Brown)
└─ Project Manager (Luke Pan)
   ├─ Technical Director — Aerodynamics (Heath Lewis)
   │   └─ Engineers (Aerodynamics)
   ├─ Technical Director — Aeronautical Structures and Manufacturing (Luke Pan)
   │   └─ Engineers (Aeronautical Structures and Manufacturing)
   └─ Technical Director — Avionics and Electrical Systems (Minh Thang Pham)
       └─ Engineers (Avionics and Electrical Systems)

Figure 1. Our existing team organisation at UNSW Skylabs DBF.

This hierarchical chain of command is a simple yet effective approach to engineering management; however, the lack of role specificity risks devolving into an unproductive state, and may lead to confusion and inefficiency. In addition, other high-performing teams tend to include additional logistics and management roles[1][2][3], which may be worth considering.

We propose the following extended management structure.



Proposed Organisational Structure

Academic Advisor
└─ Project Manager
   ├─ Technical Director — Aerodynamics and Analysis
   │   ├─ Principal Engineer (Flight Dynamics) → Engineers (Analysis)
   │   ├─ Principal Engineer (Structural Analysis) → Engineers (Analysis)
   │   └─ Principal Engineer (Aerodynamic Design) → Engineers (Aerodynamic Design)
   ├─ Technical Director — Aeronautical Structures and Manufacturing
   │   ├─ Principal Engineer (Structural Design) → Engineers (Aeronautical Design)
   │   ├─ Principal Engineer (Mechanical Design) → Engineers (Aeronautical Design)
   │   └─ Principal Engineer (Manufacturing) → Engineers (Manufacturing)
   ├─ Technical Director — Avionics and Electrical Systems
   │   ├─ Principal Engineer (Avionics / Software) → Engineers (Embedded Systems / Software)
   │   ├─ Principal Engineer (Electrical Systems) → Engineers (Electrical)
   │   └─ Principal Engineer (Circuit Design and Production) → Engineers (Electrical)
   └─ Operations Delegate (AvMC or PAOff)
       ├─ Remote Pilot (Designate)
       ├─ Mission Systems Officer (Designate)
       ├─ Aviation Mission Controllers (Reserve)
       └─ Public Affairs Officer

Figure 2. Proposed new team organisation at UNSW Skylabs DBF.

This extended structure allows for a more detailed and specific approach to engineering management, and may help to streamline the design and manufacturing process. With more granular roles, members are encouraged to take up multiple positions, as this gives appointees a more well-rounded understanding of the project, as well as a more balanced, ground-level perspective. It also allows prospective leaders to shadow their superiors and develop the skills needed to take on the roles of their predecessors. As for the proposed operations team, we particularly recommend the inclusion of a Public Affairs Officer. This role is often overlooked, but is critical in maintaining a positive public image, and in reinforcing UNSW Skylabs DBF's reputation as a team of bright, innovative engineers. The operations team also internalises the mission controllers, who are responsible for the safe and efficient operation of the aircraft during flight.

Planning

Another key area of improvement for UNSW Skylabs DBF is a concise and distributed approach to time management, which, in contrast to other teams[1][2][3], UNSW Skylabs DBF currently lacks. A well-structured plan is essential to the success of the project, and can help ensure that it is completed on time and within budget. In particular, it is imperative that UNSW Skylabs DBF develops a detailed timeline that is provided to all team members and updated regularly. We recommend that a full timeline be developed in the coming weeks, including separately assigned tasks for each engineering team. Specifically, we recommend that key milestones and deliverables be highlighted, and that the flight testing schedule be strictly adhered to. A complete and accurate timeline will require some administrative effort, and as such, some discussion. Of note is that GitHub Projects[4] offers a simple (free) method of timeline management that may be worth investigating as a practical, if rudimentary, approach to timelining.

Furthermore, we recommend the development of an engineering design and manufacturing workflow. It is essential to have a clear understanding of the steps involved in the design and manufacturing process, and to have a plan in place to ensure these steps are completed in a timely manner. This takes the form of a decision flowchart, which outlines the iterative process of design, manufacture, and testing. If adhered to effectively, the workflow can help ensure that the project is completed to an acceptable standard and on schedule.

We propose the following workflow.



Proposed Engineering Workflow

  1. Aircraft Design Selection — pick the winning aircraft design from the candidate pool; notify the team of the selection.
  2. Preliminary Analysis and Design Refinement — variable analysis; performance targets (mass, power, etc.); aerodynamic simulation; propulsion planning; structural design.
  3. Mechanical and Electrical Design — CAD drafts; manufacturing planning; component selection; wiring and integration planning.
  4. Manufacturing — airframe and wing manufacture; component manufacture; composite manufacture; assembly planning; quality assurance.
  5. Testing — component testing; system testing; flight testing; mission simulation; pilot feedback; final quality assurance.
  6. Mission Readiness Review (reached after preliminary analysis, and again after testing) — have the mission and task requirements been adequately addressed? A (final) decision on aircraft suitability.
     ⇒ Fail → return to Aircraft Design Selection (step 1).
     ⇒ Pass → continue design refinement (step 2) and proceed to Practice — circuit runs; mission simulation; quality assurance (last chance).

Figure 3. A new, streamlined engineering workflow for UNSW Skylabs DBF.

Mission Optimisation

Maximising our overall mission score is critical to our success in the DBF25 competition. This comprises two key components: successfully completing missions, and producing a high-quality design report. More specifically, the overall score is determined as follows.

$$M_x = s_{\text{report},\%} \cdot M_{x,\text{raw}} + s_p$$
$$M_\text{tot} = M_1 + M_2 + M_3 + M_\text{ground}$$
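As a concrete illustration, the scoring formulas above can be evaluated directly. Every numeric input in the sketch below is a hypothetical placeholder, not an actual DBF25 value.

```python
# Illustrative evaluation of the scoring formulas above.
# All numeric inputs are hypothetical placeholders, not actual DBF25 scores.

def mission_score(s_report_pct: float, m_raw: float, s_p: float = 0.0) -> float:
    """M_x = s_report,% * M_x,raw + s_p."""
    return s_report_pct * m_raw + s_p

def total_score(s_report_pct: float, raw_scores: list[float], m_ground: float) -> float:
    """M_tot = M_1 + M_2 + M_3 + M_ground."""
    return sum(mission_score(s_report_pct, r) for r in raw_scores) + m_ground

# e.g. an 85% report factor, three raw flight-mission scores, and a ground score of 1.0
print(total_score(0.85, [1.0, 1.6, 2.1], 1.0))  # → 4.995
```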

The established trend is for mission design to comprise three flying missions and one ground mission. For at least the last three years, the flying circuit has been identical[1][2][3]; there is no reason to suggest that this will change.

Figure 4. The DBF22/23/24 flying circuit.

In 2024, the mission theme was humanitarian aid[1].

In 2023, the mission theme was electronic warfare[2].

In 2022, the mission theme was humanitarian aid[3].

With regards to M2 and M3, high-scoring teams found some success in estimating variable relationships and determining the values that correspond to the maximum score[1]. For example, $f_{M,2} = 1 + \frac{1}{k} \cdot \frac{m}{t(m)}$, with $t(m) \approx m^3$, would indicate that the mass of the supplies should be decreased as much as possible to maximise the score (note: this is not necessarily true, just an example!). An important consideration amongst high-scoring teams was to carefully assess how energy consumption affected key variables[1][2][3].
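To make the approach concrete, here is a minimal sketch of that worked example. The constant k, the mass range, and the cubic lap-time model are assumptions carried over from the illustration above, not real mission parameters.

```python
# Hypothetical variable-relationship sweep for the example score factor
# f_M2(m) = 1 + (1/k) * m / t(m), with an assumed lap-time model t(m) ~ m^3.
import numpy as np

k = 10.0                            # assumed normalisation constant
m = np.linspace(0.5, 5.0, 1000)     # candidate supply masses, kg (assumed range)
t = m**3                            # assumed lap-time model
f_m2 = 1 + (1 / k) * m / t          # simplifies to 1 + 1/(k*m^2)

best_mass = m[np.argmax(f_m2)]
print(best_mass)  # the lowest mass in range maximises the score under this model
```

Under these assumptions the factor decreases monotonically with mass, matching the conclusion in the text; a different t(m) model would change the optimum entirely, which is precisely why the relationship must be estimated carefully.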

In general, high-performing teams also found success in both constraint analysis and sensitivity analysis.

For constraint analysis, two critical variables are plotted against each other, with regions highlighted according to calculations derived from the constraining requirements[1][2][3]. From this, the feasible region can be identified, and the optimal values for the variables then determined. Note that this is a very simplified explanation; the actual process is much more complex. Furthermore, in most cases this type of constraint analysis relies on numerous assumptions that must be verified to remain valid.
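A toy sketch of this kind of constraint analysis follows, using wing loading against power-to-weight as the two variables. The constraint expressions and axis ranges are entirely placeholder; the real ones must be derived from the competition rules and verified.

```python
# Toy constraint-analysis grid: wing loading (W/S) vs. power-to-weight (P/W).
# All constraint expressions below are placeholders, not derived requirements.
import numpy as np

ws = np.linspace(20, 120, 200)    # wing loading, N/m^2 (assumed axis range)
pw = np.linspace(2, 20, 200)      # power-to-weight, W/N (assumed axis range)
WS, PW = np.meshgrid(ws, pw)

stall_ok  = WS <= 80              # placeholder stall-speed constraint
climb_ok  = PW >= 0.05 * WS + 4   # placeholder climb-rate constraint
cruise_ok = PW >= 250.0 / WS      # placeholder cruise constraint

feasible = stall_ok & climb_ok & cruise_ok
print(feasible.any())             # a feasible region exists for these placeholders
```

In practice each boolean mask would come from a requirement-derived equation, and the feasible mask would then be searched (or plotted) for the design point that best trades off the two variables.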

For sensitivity analysis, the goal is to determine how changes in one variable affect another. This is particularly useful in determining the optimal values for variables, and can be used to identify the most effective strategies for maximising mission score. To measure the sensitivity of $y$ to a fractional change $p \in [0, 1]$ in a variable $x$, see the following formula: $y_\% = \frac{y(x + x \cdot p)}{y(x)}$ (loosely speaking; with multivariable functions this is more complex). With this technique, it is possible to estimate the optimal values for variables, and further develop strategies to maximise mission score.
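The ratio above translates directly into code for the single-variable case; the sample drag model used to exercise it is illustrative only.

```python
# Direct implementation of the sensitivity ratio y_% = y(x + x*p) / y(x)
# for a single-variable function y.

def sensitivity(y, x: float, p: float) -> float:
    """Ratio of perturbed output to nominal output for a +p fractional change in x."""
    return y(x + x * p) / y(x)

# Toy drag model D(v) = 0.5 * rho * (C_D * S) * v^2 with assumed coefficients.
drag = lambda v: 0.5 * 1.225 * 0.02 * v**2

print(sensitivity(drag, 20.0, 0.10))  # a 10% speed increase scales drag by 1.21
```

Since drag here scales with v squared, a 10% speed increase yields a factor of 1.1² = 1.21, which is the kind of quick relationship this metric is meant to expose.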

Finally, though outside our expertise, it is pertinent to mention that some teams have found success in mission simulation through scripting tools (e.g. MATLAB)[1][2][3].

Aeronautical Design and Aerodynamics

Candidate Aircraft Designs

A critical aspect of succeeding in the DBF competition is to select a highly applicable aircraft design. To achieve this, we propose two key strategies.

  1. Allow and encourage students, both internally and from UNSW Engineering in general, to submit candidate aircraft designs for evaluation and preliminary aerodynamic testing.
  2. Use quality function deployment (QFD) matrices to quantify, if only by estimation, which design and performance metrics UNSW Skylabs DBF should consider, and how they should be weighted in importance.

Contrary to common DBF practice[1][2][3], we argue that the implementation of (comparatively) arbitrary metrics for aircraft selection is not the most effective strategy. Instead, we propose the qualitative assessment of candidate aircraft designs, and the use of a QFD matrix to weigh design factors. In short, UNSW Skylabs DBF should consider the aircraft design as a whole, and not just the individual components; in effect, “the whole is greater than the sum of its parts”. This way, the team not only ensures an adequate design pool to select from, but can also select the most appropriate design for the needs of the competition, which enables maximum flexibility in the engineering workflow and reserves the ability to pivot if necessary.

The use of QFD matrices is commonplace amongst other teams[1][2][3]; however, we believe their implementation is only as effective as the quality of the data input. Hence we recommend that UNSW Skylabs DBF take the time to carefully consider the design factors — notably, using qualitatively determined, high-performing aircraft designs as a benchmark for our own selection process and QFD weights. Importantly, creating an effective and useful QFD matrix is not a trivial task, and requires a significant amount of time and effort, in conjunction with existing engineering work, to develop; thus, we will not provide a proposed matrix here.

Finally, for lack of a better place to mention it, UNSW Skylabs DBF should strongly consider SolidWorks as our primary CAD software: it is widely used in the aerospace industry, and is a powerful tool for aircraft design, able to produce complete engineering drawings and even (to some extent) simulate the performance of the aircraft.
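For illustration only, a weighted-scoring sketch of the QFD-style comparison described above. The criteria, weights, candidate designs, and 1–5 ratings are all invented placeholders, and this is explicitly not the proposed matrix.

```python
# Minimal weighted-decision (QFD-style) scoring sketch.
# Criteria, weights, and ratings are invented placeholders for illustration.

weights = {"payload capacity": 0.30, "cruise efficiency": 0.25,
           "manufacturability": 0.25, "ground-mission handling": 0.20}

candidates = {
    "conventional monoplane": {"payload capacity": 4, "cruise efficiency": 4,
                               "manufacturability": 5, "ground-mission handling": 3},
    "blended wing":           {"payload capacity": 5, "cruise efficiency": 4,
                               "manufacturability": 2, "ground-mission handling": 3},
}

scores = {name: sum(weights[c] * rating for c, rating in ratings.items())
          for name, ratings in candidates.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 2))
```

The value of the exercise lies in choosing the weights, which is exactly why we recommend benchmarking them against qualitatively assessed, high-performing designs rather than guessing.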

Aerofoil Design

In terms of aerofoil design, we recommend the selection of an existing design from the NACA aerofoil database. This is common practice amongst high-performing teams, and is a simple yet effective way to ensure the aircraft has a reliable, well-tested, high-quality aerofoil. Additionally, it allows the team to focus on other aspects of the aircraft design.

Furthermore, to effectively compare aerofoil designs, we encourage not only drawing from existing literature, but also conducting our own analysis. Primarily, we recommend the use of industry-standard software such as XFLR5[5] and AVL[6] to predict the static and dynamic stability of different designs. In particular, XFLR5 is tacitly endorsed by AIAA, as detailed in at least one official publication[7]. Whilst UNSW Skylabs DBF currently makes use of XFLR5, we believe it is in the team's interest to adopt AVL as well. AVL, developed at the Massachusetts Institute of Technology, is widely used in the aerospace industry for the analysis of aircraft designs; it can predict the static and dynamic stability of a configuration, and can even help calculate control surface trim. Some teams have also found success integrating OpenVSP[8] into their analysis stack, so it may be worth investigating as a potential tool for our own use (though it, in and of itself, is not an analysis tool).

In terms of aerodynamic optimisation, well-placing teams have found a level of success in analysing per-component drag, thereby identifying high-drag areas of the aircraft and refining the aerodynamic design in those areas to reduce drag.

We also find it pertinent to stress the importance of using aerodynamic simulation data to predict the handling characteristics of the aircraft. It would be unfortunate, to say the least, if UNSW Skylabs DBF were to design an aircraft that was unstable or difficult to control, with this revealed only during flight testing.

At the end of the day, aerofoil selection is up to the aerodynamics team; however, we recommend the team keep in mind that the aircraft, equipped with the chosen aerofoil, must produce adequate lift in our low-speed subsonic flight regime, and must maintain stability throughout flight.

Statistics and Testing

Limits and Performance Targets

To ensure that design, manufacture, and testing remain within expectations, we recommend the establishment of performance targets and limits. These enable the team to keep the aircraft design within acceptable bounds, and to identify potential issues early in the design process. Firstly, we propose the introduction of a power consumption limit and a corresponding power distribution target. Considering the 100 Wh hard limit, careful consideration must be given to ensuring our power capacity can meet mission demands. To put this into perspective, our Raspberry Pi 5 will run at a power draw ceiling of 5 V × 5 A = 25 W. Secondly, we propose the introduction of a weight limit and a corresponding weight distribution target, estimated in consultation with the aerodynamics team.
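A rough energy-budget check against the 100 Wh limit might look like the following sketch. Apart from the Raspberry Pi 5 ceiling quoted above, every component draw and the mission duration are assumed placeholder figures.

```python
# Rough power/energy budget check against the 100 Wh battery limit.
# Component draws (except the Pi 5 ceiling) and mission time are assumed figures.

BATTERY_WH = 100.0                 # competition hard limit
components_w = {
    "propulsion (avg)": 550.0,     # assumed average motor draw
    "raspberry pi 5":    25.0,     # 5 V * 5 A ceiling, per the text
    "servos + radio":    15.0,     # assumed
}

mission_minutes = 8.0              # assumed worst-case flight window
total_w = sum(components_w.values())
energy_wh = total_w * mission_minutes / 60.0
margin_wh = BATTERY_WH - energy_wh
print(f"draw {total_w:.0f} W, energy {energy_wh:.1f} Wh, margin {margin_wh:.1f} Wh")
```

Keeping a script like this alongside the component selection spreadsheet makes the power distribution target auditable every time a component changes.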

Testing

In terms of stress testing, we recommend the use of strain gauges to measure the strain on the aircraft during flight. With adequate setup, this is a relatively straightforward process, and can provide valuable data on the performance of the aircraft. In addition, though rudimentary, both drop testing and additive mass testing are recommended to ensure the aircraft is capable of withstanding the forces of flight. Through drop testing, the team can evaluate the landing forces on the aircraft and ensure that it can land safely. Additive mass testing, on the other hand, allows the team to iteratively add mass at critical points on the aircraft and determine its effect on the structure. For accurate preliminary thrust data, particularly to pass to the aerodynamics team, we also recommend static thrust tests.
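As a sketch of the strain-gauge data reduction, the commonly used quarter-bridge relation can convert measured bridge voltages to strain. The gauge factor and example voltages here are assumptions, and the sign convention depends on which bridge arm the gauge occupies, so this should be checked against the actual DAQ setup.

```python
# Hedged sketch: quarter-bridge strain-gauge conversion using the common
# relation strain = -4*Vr / (GF * (1 + 2*Vr)), where Vr is the change in
# bridge output ratio and GF is the gauge factor (typically around 2.0).
# Sign convention depends on bridge wiring; verify against the real setup.

def quarter_bridge_strain(v_out: float, v_ex: float, gauge_factor: float = 2.0,
                          v_out_unstrained: float = 0.0) -> float:
    """Return strain (dimensionless) from measured bridge voltages."""
    vr = (v_out - v_out_unstrained) / v_ex
    return -4.0 * vr / (gauge_factor * (1.0 + 2.0 * vr))

# e.g. a -1 mV shift on a 5 V excitation reads roughly 400 microstrain
print(quarter_bridge_strain(-0.001, 5.0) * 1e6)
```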

For flight testing, it is of utmost importance that we collect and iterate on pilot feedback. As such, we recommend the inclusion of a standardised feedback report for pilots to complete after each flight, including a Cooper–Harper rating scale[9] to evaluate the overall handling characteristics of the aircraft. In addition, the data collected during flight testing should be sufficient to verify aerodynamic simulation data and predictions.

Preparation Checklists

To maximise assembly and operational efficiency, we suggest that the team develop a series of preparation checklists to codify standard operating procedures. Commonplace amongst well-prepared teams[1][2][3], these checklists help ensure that the aircraft is assembled correctly and is airworthy before flight. In addition, checklists are useful for ensuring that the aircraft is properly disassembled and stored after flight.

We recommend the inclusion of the following checklists.

Assembly
Damage Inspection: Pass
Attachment and Mounting: Pass
Control Surface Actuation Check: Pass
⇒ All Pass → Ok
⇒ Fail → Reassess, take further action.

Avionics
Damage Inspection: Pass
Attachment and Mounting: Pass
Component Function: Pass
Battery Capacity Check: Pass
Communications Check: Pass
Software Verify: Pass
⇒ All Pass → Ok
⇒ Fail → Reassess, take further action.

Aircraft Final Verify
Expected Metric: Pass
⇒ All Pass → Ok
⇒ Fail → Reassess, take further action.

Preflight Checklist
Safety: Pass
Systems Functional: Pass
Responsive to Input: Pass
⇒ All Pass → Ok
⇒ Fail → Reassess, take further action.

Landing Checklist
Safety: Pass
Systems Functional: Pass
Responsive to Input: Pass
⇒ All Pass → Ok
⇒ Fail → Reassess, take further action.

Shutdown Checklist
As Expected: Pass
⇒ All Pass → Ok
⇒ Fail → Reassess, take further action.
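The pass/fail logic shared by all of the checklists above can be captured in a small helper. The item names mirror the assembly checklist; statuses would be filled in on the day.

```python
# Simple sketch of the shared checklist logic: all items must pass,
# and any failure means "Reassess, take further action".

def evaluate_checklist(name: str, items: dict[str, bool]) -> str:
    """Return 'Ok' if every item passes, else list the failed items."""
    if all(items.values()):
        return f"{name}: Ok"
    failed = [item for item, ok in items.items() if not ok]
    return f"{name}: Reassess ({', '.join(failed)})"

assembly = {"Damage Inspection": True, "Attachment and Mounting": True,
            "Control Surface Actuation Check": True}
print(evaluate_checklist("Assembly", assembly))  # Assembly: Ok
```

Encoding the checklists this way (e.g. in a small flight-line script or spreadsheet) also leaves a timestamped record of what was verified before each flight.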

References

1.  2024 DBF Top 3 Reports. (n.d.). [online] Available at: https://www.aiaa.org/docs/default-source/uploadedfiles/aiaadbf/previous-competitions/2024-dbf-top-3-reports.pdf.

2.  2023 DBF Top 3 Reports. (n.d.). [online] Available at: https://www.aiaa.org/docs/default-source/uploadedfiles/aiaadbf/2023-dbf-competition-top-3-reports.pdf.

3.  2022 DBF Top 3 Reports. (n.d.). [online] Available at: https://www.aiaa.org/docs/default-source/uploadedfiles/aiaadbf/previous-competitions/top-reports/2022-aiaa-dbf-top-3-reportsd97c1337115848488c92228e188607c7.pdf?sfvrsn=ea245bb8_0.

4.  GitHub Docs. (n.d.). About Projects. [online] Available at: https://docs.github.com/en/issues/planning-and-tracking-with-projects/learning-about-projects/about-projects.

5.  Xflr5.tech. (2013). XFLR5. [online] Available at: http://www.xflr5.tech/xflr5.htm.

6.  web.mit.edu. (n.d.). AVL. [online] Available at: https://web.mit.edu/drela/Public/web/avl/.

7.  Millard, J., Booth, S., Rawther, C. and Hayashibara, S. (2022). XFLR5 as a Design Tool in Remotely Controlled Design-Build-Fly Applications. AIAA SCITECH 2022 Forum. doi:https://doi.org/10.2514/6.2022-0003.

8.  openvsp.org. (n.d.). OpenVSP. [online] Available at: https://openvsp.org/.

9.  Wikipedia Contributors (2023). Cooper–Harper rating scale. [online] Wikipedia. Available at: https://en.wikipedia.org/wiki/Cooper%E2%80%93Harper_rating_scale.
