A Model to Lock-in Gains When Trading Single-stock ETFs

A prior post titled “What are Single-stock ETFs and Should I Invest in Them?” introduced readers of this blog to leveraged single-stock ETFs.  The focus of the prior post was leveraged ETFs that return up to two times the daily return of their underlying tickers.

This ratio of a two-to-one daily return for a single-stock ETF applies to both increases and decreases in the underlying stock.  Single‑stock ETFs are most commonly offered for tickers that experience high trading volume, strong retail interest, and episodes of fast‑paced price growth.  This practice entices traders/investors to buy single-stock ETFs, even though there is the risk of losing twice as much on a single trading day as with the underlying security.  In addition, there are other considerations that can negatively impact the prices of single-stock ETFs, such as daily rebalancing costs and volatility drag.

  • Rebalancing costs are primarily for daily swap financing and rebalancing slippage.  These two items represent the cost of doing business as a single-stock ETF.  Rebalancing slippage happens because the ETF must buy after up days and sell after down days to reset its leverage, and those trades rarely occur at optimal prices.  Daily swap financing is the cost the ETF pays its swap counterparty for “renting” leveraged exposure to the underlying stock each day.
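The daily reset mechanics behind rebalancing slippage can be sketched numerically. The function below is a simplified illustration with assumed round numbers (it ignores swap financing and trading costs, and is not the actual mechanics of any specific fund):

```python
def rebalance_trade(nav, leverage, move):
    """Dollar amount a daily-reset leveraged fund must trade after a move.

    Positive = buy (after an up day), negative = sell (after a down day).
    Illustrative only; real funds gain exposure through swaps and incur
    financing costs on top of this trading.
    """
    exposure = leverage * nav             # exposure held during the day
    new_nav = nav + exposure * move       # leveraged P&L accrues to NAV
    new_exposure = exposure * (1 + move)  # position drifts with the move
    target = leverage * new_nav           # reset to the stated leverage
    return target - new_exposure

print(rebalance_trade(100.0, 2, 0.05))   # buys $10 after a +5% day
print(rebalance_trade(100.0, 2, -0.05))  # sells $10 after a -5% day
```

The sign of the required trade shows why the fund buys after up days and sells after down days, which is exactly the pattern that produces slippage.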

  • Volatility drag can occur because percentage gains and losses are not symmetric.  A ten percent loss on one trading day followed by a ten percent gain on the next leaves the trader/investor down one percent, because the gain applies to a smaller base.  Reaching breakeven after a ten percent loss requires a gain of roughly 11.11 percent on the second trading day.
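The arithmetic behind volatility drag is easy to verify. The following sketch uses hypothetical round numbers to show the unleveraged and 2x cases side by side:

```python
# Volatility drag: -10% then +10% does not return to breakeven,
# and daily 2x leverage (-20% then +20%) makes the drag worse.
start = 100.0
unlev = start * 0.90 * 1.10   # ends at 99.0, down 1%
lev2x = start * 0.80 * 1.20   # ends at 96.0, down 4%, not merely 2x the 1%

# After a 10% loss you need a gain of 0.10 / 0.90, about 11.11%, to break even.
breakeven_gain = 0.10 / 0.90

print(unlev, lev2x, round(breakeven_gain * 100, 2))
```

Note that the leveraged drag (4%) is four times the unleveraged drag (1%), not double it, because the drag grows with the square of the daily move.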


This post aims to expose you to an analytical framework that will keep you out of single-stock leveraged ETFs until they are likely to enter a period of explosive growth and to retain most of the gains during that period.

Highlights from This Blog’s Initial Single-stock ETF Post

Many stock market analysts recommend that investors who prefer a buy-and-hold strategy avoid single-stock ETFs.  Some analysts even recommend that traders avoid single-stock ETFs because of the difficulty of estimating when an underlying stock will rise, and because of the performance drag from holding a single-stock ETF while its underlying stock declines or stays relatively flat over successive trading days.

Instead of relying on assumptions about how single-stock ETFs could perform, this blog’s initial post for single-stock ETFs empirically tested returns for ten leveraged securities.

  • The percentage change across all ten underlying securities was 80.65%.  In contrast, the percentage change across the top five underlying securities was 109.78%.  This reveals a 29.13 percentage point advantage for the top five securities relative to all ten underlying securities.

  • Furthermore, the percentage change across all ten leveraged ETFs was 109.78% compared to 214.12% for the top five leveraged ETFs.  Consequently, the top-five advantage for the leveraged ETFs was 104.34 percentage points.

  • The top-five advantage was more than 2.5 times larger for the leveraged ETFs than for their underlying securities!  Leverage, when implemented with single-stock ETFs, can have a huge impact on performance.
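As a quick check, the advantage figures quoted above can be re-derived from the reported totals:

```python
# Re-derive the top-five advantage figures from the reported totals.
all_underlying, top5_underlying = 80.65, 109.78
all_etf, top5_etf = 109.78, 214.12

underlying_adv = top5_underlying - all_underlying  # 29.13 points
etf_adv = top5_etf - all_etf                       # 104.34 points

print(underlying_adv, etf_adv, etf_adv / underlying_adv)
```

The ratio of the two advantages comfortably exceeds 2.5, which supports the claim in the bullet above.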

A Project Design to Lock in Gains for Leveraged ETFs

This second post on single-stock ETFs adapts a trading model to discover dates when an underlying security is likely to grow.  The strategy then uses those same dates to assess how much the leveraged ETF grows across successive trades.  This approach is much more powerful than the one in the first single-stock ETF blog post because a model picks the entry/exit trade dates for the underlying security of a single-stock ETF, and those same entry/exit dates then determine the buy and sell dates for trades with the leveraged ETF.

The following diagram shows a conceptual overview of the process for collecting and analyzing historical prices for 8 tickers tracked in this blog post.  Notice that it starts with LSEG Data Analytics.



Each Excel worksheet tab stores historical price data for a separate ticker.  The “How to Populate a Trading Database with Refinitiv, Excel, and SQL Server—Update 2” post demonstrates the process for populating the worksheets, the steps for copying worksheet contents to csv files, and a T-SQL script for importing the csv files into a tall SQL Server table (dbo.stockhistory_clean).  The tall SQL Server table has columns for the date, open, and close values for each ticker.  The following screenshot displays the ticker, the row count (number of trading days), and the first and last trading days for each ticker analyzed in this post.



The AMZN, MU, NVDA, and PLTR tickers are the underlying securities for the single-stock ETFs.  The AMZU, MUU, NVDL, and PTIR tickers designate the actual single-stock leveraged ETFs.

Another T-SQL script (available in the Code Appendix to this post) computes two EMA series with period lengths of 21 and 63 trading days.  The EMA values are stored in the two rightmost columns of an updated version of the dbo.stockhistory_clean table.  The name of the updated table is dbo.stockhistory_with_emas.

The EMA column values are required by the Proper Order and Gain Lock-in (POGL) model.  The code for an early version of the POGL model is available in the “A Preliminary Analysis of EMA Period Lengths and Price Action in a Buy-Sell Model” post.
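For readers who want to see the EMA recurrence outside of T-SQL, here is a minimal Python sketch of an EMA seeded with a simple moving average, which is the approach the referenced stored procedure takes. The function name and details here are illustrative, not the stored procedure's actual code:

```python
def ema(closes, period):
    """EMA seeded with the SMA of the first `period` closes.

    Uses the standard smoothing factor alpha = 2 / (period + 1).
    Illustrative Python counterpart to the T-SQL EMA computation;
    returns one value per trading day from the seed day onward.
    """
    if len(closes) < period:
        return []
    alpha = 2.0 / (period + 1)
    seed = sum(closes[:period]) / period   # SMA seed value
    out = [seed]
    for price in closes[period:]:
        out.append(alpha * price + (1 - alpha) * out[-1])
    return out

# The EMA of a constant price series stays at that constant.
print(ema([10.0] * 25, 21))
```

The same recurrence with periods 21 and 63 produces the ema_21 and ema_63 columns the POGL model consumes.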

T-SQL Code for the POGL Model

The full T-SQL script for the POGL model implementation is available in the Code Appendix of this post.  This model aims to identify profitable entry and exit dates based on the price action of an underlying ticker.  The script then applies those specific dates to calculate performance for a corresponding leveraged ETF.

To accomplish the preceding tasks, the script utilizes a temporary table populated with data from dbo.stockhistory_with_emas.  This allows the T-SQL script to look up the ETF’s open price on the trading day after the underlying security triggers a signal.  The difference between the entry and exit prices reveals the return for each successive ETF trade.

The script’s logic is divided into three primary functional phases, followed by a reporting section that generates the final backtesting results.

Phase 1: Mapping and Workspace Preparation

The process begins by establishing a #ticker_map to link underlying securities with their corresponding leveraged ETFs (for example, matching NVDA with NVDL). The script also initializes a results table to store the details of every trade. This structure is essential because it allows the model to process a watchlist of multiple ticker pairs in a single execution.

Phase 2: Trade Initiation and Boundary Traversal

The core script logic iterates through the historical data of the underlying ticker to manage the lifecycle of each trade:

  • Initiating Trades (Proper Order): The model identifies an entry point when the underlying security enters "proper order," which occurs when the close price rises above the 21-day EMA, provided the 21-day EMA is also above the 63-day EMA anchor.
  • Traversing Boundaries (Gain Lock-in): Once a trade is active, the script establishes upper and lower boundary values. When the close price reaches or exceeds the upper boundary value, both the upper and lower boundary values are adjusted upward. As the price traverses successively higher upper boundary values (rungs), the model "ratchets" up the stop-loss floor (lower boundary value). This manages the trade by locking in progress as the price moves up and to the right.
  • Exiting the Trade: A trade concludes when the underlying close price falls to or below the current floor (the lower boundary value).
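The boundary traversal described above can be sketched outside of T-SQL. The following Python function is an illustrative simplification of the POGL ratchet for a single open trade; the 110%/95% multipliers mirror the script in the Code Appendix, while the function name itself is hypothetical:

```python
def pogl_trade(closes, entry, up_mult=1.10, dn_mult=0.95):
    """Sketch of the POGL boundary logic for one open trade.

    Boundaries mirror the T-SQL script: the next rung sits at 110% of
    the last rung, and the stop-loss floor ratchets to 95% of the last
    rung reached. Returns (exit_index, rungs_climbed); exit_index is
    None if the trade is still open at the end of the data.
    """
    upper, floor, rungs = entry * up_mult, entry * dn_mult, 0
    for i, close in enumerate(closes):
        if close >= upper:            # climb a rung: ratchet both bounds
            rungs += 1
            floor = upper * dn_mult
            upper = upper * up_mult
        if close <= floor:            # price fell to the lock-in floor
            return i, rungs
    return None, rungs

print(pogl_trade([105, 111, 118, 122, 114], 100))  # -> (4, 2)
```

In the sample call, the price climbs two rungs (110 and 121), so the floor ratchets up to 114.95, and the close of 114 on the fifth day triggers the exit with most of the gain locked in.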

Phase 3: Synchronizing Underlying Dates to ETF Performance

A key feature of this implementation is the synchronization of dates.  By copying the entry and exit dates derived from the underlying security and applying them directly to the ETF data, the script can compute the gains for a leveraged single-stock ETF.  This ensures the ETF performance is based on the specific trend windows identified for its corresponding underlying ticker.

Phase 4: Final Analytics and Result Sets

Finally, the script generates two distinct views to evaluate the strategy:

  • The Trades Table: A chronological log of every individual trade, displaying the entry/exit dates and the number of rungs (boundaries) successfully traversed.
  • The Summary Table: An aggregate view that compares the performance of the underlying ticker against the ETF across the entire testing period. This table highlights the total "Gain Lock-ins" and the final compounded return, which represent two different approaches for assessing the growth of the trading account.

Review Result Sets

The following screenshot compares the underlying tickers (MU, NVDA, PLTR, and AMZN) to their corresponding leveraged ETF tickers (MUU, NVDL, PTIR, and AMZU) over the lifespan of the leveraged ETFs, from inception through April 24, 2026, the last trading day for which historical prices were collected for this post.  While the ending date is always April 24, 2026, the beginning trading day for each pair of matching tickers depends on when the pair’s leveraged ETF initially became available for public trading.  Because the inception date varies across the ETF tickers, the evaluation period varies from one ticker pair to the next.

There are also two different metrics for comparing performance.

  • The Total Summed Percentage Change represents the sum of the percentage changes across all trades for that ticker.  This metric tracks the percentage return from each trade individually.  In other words, a poor performance for one trade impacts the metric’s value for just that trade.

  • The Compounded Percentage Change is based on the product of the percentage changes across all trades.  This metric tracks the cumulative sequential changes across all trades.  In other words, a poor performance for one trade adversely affects the current trade and the subsequent trade because the account balance is less for the subsequent trade.
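The two metrics can be illustrated with a short calculation on hypothetical per-trade returns; the compounding loop mirrors the exp(sum(log(...))) aggregation used in the POGL script's summary query:

```python
# Summed vs compounded percentage change for a sequence of trade returns.
returns_pct = [10.0, -5.0, 20.0]   # hypothetical per-trade returns

summed = sum(returns_pct)          # each trade counted independently
growth = 1.0
for r in returns_pct:              # each trade scales the account balance
    growth *= (1 + r / 100)
compounded_pct = (growth - 1) * 100

print(summed)                       # 25.0
print(round(compounded_pct, 2))     # 1.10 * 0.95 * 1.20 = 1.254 -> 25.4
```

With these sample returns the two metrics nearly agree, but a large drawdown in any single trade pulls the compounded figure well below the summed one, because every later trade starts from a smaller balance.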

As you can see from the following table, the MUU ticker outperforms its underlying ticker (MU) by a larger margin than any other ticker pair.  In contrast, the AMZU ticker underperforms its underlying ticker (AMZN) by a larger margin than any other ticker pair.  This pattern holds true across both metrics.



To further reinforce the basis for the results from the preceding table, performance charts prepared at Finviz.com are displayed for each matching pair of tickers over the lifespan of its ETF ticker member.  These charts display the percentage change for each trading day relative to the inception date for the ETF ticker in each pair.  The charts show that the AMZN/AMZU pair is the only pair for which the leveraged ETF member underperforms the underlying member.  Furthermore, the MU/MUU pair shows greater outperformance for its ETF member than any other ticker pair.



The next four screenshots display four trades table sections – one for each pair of tickers analyzed in this post.

  • The first two columns of each trades table section denote the underlying ticker and single-stock ETF symbols.
  • The next two columns reveal the entry and exit dates for each trade created by the POGL model.
  • The following two columns show the underlying and ETF return percentages for each trade.
  • The Rungs Climbed column indicates the number of rungs climbed within a trade.

The number of rungs climbed per trade appears to have a dominant impact on the return percentage.  Three points support this conclusion:

  • All trades with zero rungs climbed have negative return percentages,
  • Every trade with at least one rung climbed, except for one, has a positive return percentage, and
  • Among the trades for each ticker pair, larger percentage returns are associated with trades having more rungs climbed.











Concluding Comments

This blog’s second post on single-stock leveraged ETFs confirms that active traders who follow a strategy like the POGL model can generally benefit from trading single-stock ETFs.  This is because three of four ticker pairs showed the POGL model locked in sufficient gains for single-stock ETF tickers to outperform their underlying unleveraged counterpart tickers.  This outcome does not deny the risks associated with single-stock ETFs, but it does demonstrate that a well-constructed trading strategy can offset these risks.

Additionally, the performance charts from Finviz.com illustrate that single-stock ETFs can outperform their underlying securities in three of four comparisons using a simple buy-and-hold strategy.  This outcome is likely when the underlying stock has consistently rising prices.  Therefore, even without special measures to counteract leveraged losses on trading days with falling prices, volatility drag, and the higher management fees associated with daily rebalancing, single-stock ETFs may be appropriate for long-term traders who can pick winning stocks in the long run and are not dissuaded from holding the ETFs during downturns.

As with any asset class, you need to perform due diligence in security selection and risk management for when to cash in gains, but the results presented in this post suggest the POGL model can help you address both issues.  Further research with more ticker pairs is warranted to verify the consistency of the results across a broad range of single-stock leveraged ETFs.

Code Appendix

This code appendix has three subsections.

  • The first subsection (Code for the POGL Model) contains the full script for the POGL model.  It is configured to run for the four pairs of tickers tracked in this post.  There are comments in the script and an additional commentary on the strategy implemented by the script in the body of this post.

  • The second subsection (Code for the Dynamic Ingestion of Multiple CSV Files into a Tall SQL Table ver_5) displays a script that implements an automated process for importing CSV files exported from an Excel workbook into a SQL Server table.  A useful reference for learning more about this script is the “How to Populate a Trading Database with Refinitiv, Excel, and SQL Server—Update 2” post.

  • The third subsection (Create and Populate dbo.stockhistory_with_emas) presents a script that amends the output from the second script with exponential moving averages as they are commonly computed for stock market analyses.  A useful reference for learning more about this script is the “Steps for Compiling a Source Dataset for the Model” section in the “An Initial Evaluation of Buy on Proper Order and Sell on Dynamic Stop Loss Orders” post.

Code for the POGL Model

 

-- running the Proper Order and Gain Lock-in (POGL) model ver 7
-- for single-stock ETFs versus their underlying stocks

use securitytradinganalytics;
go

-- Step 1: Mapping table for analysis pairs
if object_id('tempdb..#ticker_map') is not null drop table #ticker_map;
create table #ticker_map (u_ticker varchar(10), e_ticker varchar(10));
insert into #ticker_map values
('mu','muu'), ('nvda','nvdl'), ('amzn','amzu'), ('pltr','ptir');

-- Step 2: Global results table
if object_id('tempdb..#proper_order_results') is not null
    drop table #proper_order_results;
create table #proper_order_results (
    u_ticker varchar(10),
    e_ticker varchar(10),
    buy_dt date,
    sell_dt date,
    u_return_pct float,
    e_return_pct float,
    lockin_rungs int
);

 

-- Step 3: Integrated loop logic
declare @u_tkr varchar(10), @e_tkr varchar(10);
declare cur_pairs cursor for select u_ticker, e_ticker from #ticker_map;

open cur_pairs;
fetch next from cur_pairs into @u_tkr, @e_tkr;

while @@fetch_status = 0
begin
    if object_id('tempdb..#u_data') is not null drop table #u_data;
    select
        row_number() over (order by [date]) as row_id,
        [date], [open], [close], ema_21,
        ema_63 as ema_slow,
        lead([open]) over (order by [date]) as nxt_o
    into #u_data
    from dbo.stockhistory_with_emas
    where ticker = @u_tkr;

    declare @i int = 1, @max_i int = (select count(*) from #u_data);
    declare @in_pos bit = 0, @u_ent decimal(18,4), @b_dt date,
        @low_b decimal(18,4), @up_b decimal(18,4), @lvls int;

    while @i <= @max_i
    begin
        declare @d date, @uc decimal(18,4), @ue21 decimal(18,4),
            @ueslow decimal(18,4), @unxto decimal(18,4);
        select @d = [date], @uc = [close], @ue21 = ema_21, @ueslow = ema_slow,
            @unxto = nxt_o from #u_data where row_id = @i;

        if @in_pos = 0
        begin
            -- Entry: "proper order" -- close above the 21-day EMA,
            -- which in turn is above the 63-day EMA anchor
            if @uc > @ue21 and @ue21 > @ueslow and @unxto is not null
            begin
                set @in_pos = 1; set @b_dt = @d; set @u_ent = @unxto;
                set @low_b = @u_ent * 0.95; set @up_b = @u_ent * 1.10; set @lvls = 0;
            end
        end
        else
        begin
            -- The "Gain Lock-in" logic: climbing the rungs
            if @uc >= @up_b
            begin
                set @lvls = @lvls + 1; set @low_b = @up_b * 0.95;
                set @up_b = @up_b * 1.10;
            end

            -- The exit logic: stop loss at the lock-in floor
            if @uc <= @low_b and @unxto is not null
            begin
                declare @e_ent decimal(18,4), @e_exit decimal(18,4);

                -- ETF open on the trading day after the underlying's entry signal
                set @e_ent = (select [open] from dbo.stockhistory_with_emas
                    where ticker = @e_tkr and [date] =
                        (select [date] from #u_data where row_id =
                            (select row_id + 1 from #u_data where [date] = @b_dt)));

                -- ETF open on the trading day after the underlying's exit signal
                set @e_exit = (select [open] from dbo.stockhistory_with_emas
                    where ticker = @e_tkr and [date] =
                        (select [date] from #u_data where row_id =
                            (select row_id + 1 from #u_data where [date] = @d)));

                if @e_ent is not null and @e_exit is not null
                begin
                    insert into #proper_order_results values (@u_tkr, @e_tkr, @b_dt, @d,
                        ((cast(@unxto as float) - @u_ent) / @u_ent) * 100,
                        ((cast(@e_exit as float) - @e_ent) / @e_ent) * 100, @lvls);
                end
                set @in_pos = 0;
            end
        end
        set @i = @i + 1;
    end
    fetch next from cur_pairs into @u_tkr, @e_tkr;
end
close cur_pairs; deallocate cur_pairs;

 

 

 

-- Final performance summary: summed and compounded growth
select
    u_ticker as [Underlying],
    e_ticker as [SS ETF],
    count(*) as [Trade Count],
    -- Summed growth (conservative)
    cast(sum(u_return_pct) as decimal(18,2)) as [Total Summed U %],
    cast(sum(e_return_pct) as decimal(18,2)) as [Total Summed E %],
    -- Compounded growth (the realized account power)
    cast((exp(sum(log(1 + (u_return_pct / 100)))) - 1) * 100
        as decimal(18,2)) as [Compounded U %],
    cast((exp(sum(log(1 + (e_return_pct / 100)))) - 1) * 100
        as decimal(18,2)) as [Compounded E %],
    sum(lockin_rungs) as [Total Lock-ins]
from #proper_order_results
group by u_ticker, e_ticker
order by [Compounded E %] desc;

-- View the combined results for all 4 pairs with Gain Lock-in
select
    u_ticker as [Underlying],
    e_ticker as [SS ETF],
    buy_dt as [Proper Order Entry],
    sell_dt as [Gain Lock-in Exit],
    cast(u_return_pct as decimal(18,2)) as [U Return %],
    cast(e_return_pct as decimal(18,2)) as [ETF Return %],
    lockin_rungs as [Rungs Climbed]
from #proper_order_results
order by [Underlying], [Proper Order Entry];

Code for the Dynamic Ingestion of Multiple CSV Files into a Tall SQL Table ver_5

 

/* dynamic ingestion of multiple csv files into a tall sql table ver_5
   author: rick dobson
   purpose: load csv files and filter out pre-exchange #n/a rows.
*/

use securitytradinganalytics;
go

---------------------------------------------
-- 1. recreate staging table
---------------------------------------------
if object_id('dbo.stockhistory_stage') is not null
    drop table dbo.stockhistory_stage;

create table dbo.stockhistory_stage
(
    [date]   varchar(50),
    [open]   varchar(50),
    [close]  varchar(50),
    [ticker] varchar(50)
);

---------------------------------------------------------
-- 2. recreate clean table
---------------------------------------------------------
if object_id('dbo.stockhistory_clean') is not null
    drop table dbo.stockhistory_clean;

create table dbo.stockhistory_clean
(
    [date]   date        not null,
    [open]   float       not null,
    [close]  float       not null,
    [ticker] varchar(20) not null,
    constraint pk_stockhistory_clean primary key clustered (ticker, date)
);

---------------------------------------------------------
-- 3. user configuration
---------------------------------------------------------
declare @folder_path varchar(500) =
    'c:\users\user\documents\csv_from_xlsource_04_24_2026\';

if object_id('tempdb..#tickers') is not null drop table #tickers;
create table #tickers (ticker varchar(20));

-- select * from #tickers
insert into #tickers (ticker)
values ('MUU'), ('MU'), ('NVDL'), ('NVDA'),
    ('AMZU'), ('AMZN'), ('PTIR'), ('PLTR');

---------------------------------------------------------
-- 4. loop through files
---------------------------------------------------------
declare @ticker varchar(20);      -- needed for the fetch
declare @file_path varchar(500);  -- needed for the fetch
declare @sql nvarchar(max);       -- needed for the bulk insert

declare file_cursor cursor for
    select ticker, @folder_path + ticker + '.csv' from #tickers;

open file_cursor;
fetch next from file_cursor into @ticker, @file_path;

while @@fetch_status = 0
begin
    -- 4a. clear staging table for the current ticker
    truncate table dbo.stockhistory_stage;

    -- 4b. dynamic bulk insert
    -- nchar(39) is a single quote. this avoids nesting errors.
    set @sql = '
        bulk insert dbo.stockhistory_stage
        from ' + nchar(39) + @file_path + nchar(39) + '
        with (
            firstrow = 2,
            fieldterminator = ' + nchar(39) + ',' + nchar(39) + ',
            rowterminator = ' + nchar(39) + '0x0a' + nchar(39) + ',
            tablock
        );';

    exec sp_executesql @sql;

    -- 4c. filter out non-numeric values, such as #N/A, for historic prices
    insert into dbo.stockhistory_clean (date, [open], [close], ticker)
    select
        try_convert(date, [date]),
        try_convert(float, replace(replace([open], '$', ''), ',', '')),
        try_convert(float, replace(replace([close], '$', ''), ',', '')),
        @ticker
    from dbo.stockhistory_stage
    where
        try_convert(date, [date]) is not null
        and try_convert(float, replace(replace([open], '$', ''), ',', '')) is not null
        and try_convert(float, replace(replace([close], '$', ''), ',', '')) is not null;

    fetch next from file_cursor into @ticker, @file_path;
end

close file_cursor;
deallocate file_cursor;

---------------------------------------------------------
-- 5. validation
---------------------------------------------------------
select ticker, count(*) as row_count, min(date) as first_trade,
    max(date) as last_trade
from dbo.stockhistory_clean
group by ticker
order by ticker;

Create and Populate dbo.stockhistory_with_emas

 

use SecurityTradingAnalytics;
go

-- use in coordination with the
-- "Stored Procedure for Calculating Emas Based on a SMA Seed Value" section
-- from "Calculate Exponential Moving Average in SQL Server" on MSSQLTips.com
-- at https://www.mssqltips.com/sqlservertip/11544/exponential-moving-average-in-sql-server/

-- truncate dbo.ticker_date_prices and
-- reload it from dbo.stockhistory_clean
truncate table dbo.ticker_date_prices;
insert into dbo.ticker_date_prices
select ticker, date, [close]
from dbo.stockhistory_clean;

-- execute to calculate 21-day and 63-day emas
-- for dbo.ticker_date_prices
-- based on dbo.stockhistory_clean
truncate table dbo.ticker_ema_history_from_sma_seed;
exec dbo.usp_calculate_ema_from_sma_seed @period = 21;
exec dbo.usp_calculate_ema_from_sma_seed @period = 63;

-- optionally display calculated emas for dbo.stockhistory_clean
/*
select *
from dbo.ticker_ema_history_from_sma_seed
order by ticker, period, date

-- display 21-day emas
select *
from dbo.ticker_ema_history_from_sma_seed
where period = 21
order by ticker, date

-- display 63-day emas
select *
from dbo.ticker_ema_history_from_sma_seed
where period = 63
order by ticker, date
*/

-- drop the table if it already exists to allow for re-runs
if object_id('dbo.stockhistory_with_emas', 'u') is not null
    drop table dbo.stockhistory_with_emas;

-- create the new table by joining the data
-- from dbo.ticker_ema_history_from_sma_seed
-- with data from dbo.stockhistory_clean
select
    s.ticker,
    s.date,
    s.[open],
    s.[close],
    ema21.ema as ema_21,
    ema63.ema as ema_63
into dbo.stockhistory_with_emas
from dbo.stockhistory_clean s
left join (select ticker, date, ema
    from dbo.ticker_ema_history_from_sma_seed
    where period = 21
) ema21
    on s.ticker = ema21.ticker
    and s.date = ema21.date
left join (select ticker, date, ema
    from dbo.ticker_ema_history_from_sma_seed
    where period = 63
) ema63
    on s.ticker = ema63.ticker
    and s.date = ema63.date;

-- display dbo.stockhistory_with_emas
select * from dbo.stockhistory_with_emas;

