Artificial Intelligence: SEC Proposals and Concerns

Asset Management ADVocate

Proposed Rules

The US Securities and Exchange Commission (SEC) indicated this summer that it plans to introduce proposals to regulate conflicts of interest associated with artificial intelligence (AI) later this year as part of its semiannual rule-writing agenda. The SEC is considering proposed rules related to the following:

  • Broker-dealer conflicts in the use of predictive data analytics, AI, machine learning, and similar technologies in connection with certain investor interactions.[1]
  • Investment adviser conflicts in the use of predictive data analytics, AI, machine learning, and similar technologies in connection with certain investor interactions.[2]

SEC's Key Areas of Concern

1. Conflicts. Conflicts of interest posed by AI have long concerned SEC Chair Gary Gensler, who has repeatedly questioned whether brokers and financial advisers using AI can make recommendations that are in the best interests of their clients. When a broker or an adviser provides advice to a client, with or without the aid of AI, it must act in the client's best interest and not place its own interests ahead of the client's. Gensler's concern is whether the algorithms optimize for the investor's interests or instead put the adviser's interests first.[3]

Notably, in late July, the SEC proposed new rules that would require broker-dealers, as well as investment advisers registered or required to be registered under section 203 of the Investment Advisers Act of 1940, to take certain steps to address conflicts of interest associated with their use of predictive data analytics and similar technologies.[4] Specifically, the proposal requires such firms to eliminate, or neutralize the effect of, conflicts of interest that arise when the firm uses technologies that optimize for, predict, guide, forecast, or direct investment-related behaviors in investor interactions in a way that places the interest of the firm or its associated persons ahead of investors' interests (whether intentionally or unintentionally).[5] The proposal also mandates that any firm using these types of technologies adopt written policies and procedures reasonably designed to prevent violations of, and achieve compliance with, the proposed rules, including a requirement to test the technology to determine whether it could give rise to a conflict of interest.[6]

2. Systemic Risk. Gensler has highlighted in many of his speeches that too much concentration among AI programs could pose a systemic risk to the financial system.[7] In 2020, as a professor at MIT, Gensler wrote a working paper warning of the systemic risks posed by broad adoption of deep learning in finance.[8] Notably, Gensler warned that regulation could inadvertently cause problems, writing that "challenges of explainability, fairness, and robustness may lead to regulatory gaps as well as how regulatory design may promote homogeneity in deep learning models…regulatory approaches to address these challenges inadvertently may lead to…model uniformity due to standardization of regulatory requirements."[9] Gensler has since stated that a future financial crisis could be sparked "because everything was relying on one base level, what's called (the) generative AI level, and a bunch of fintech apps are built on top of it."[10] AI technology could pose a systemic risk in the very near future if AI data aggregators and generative AI models become concentrated and a widely used platform makes an error.[11]

3. Bias and Misinformation. The SEC and the Office of the Comptroller of the Currency (OCC) have acknowledged that the use of AI in the financial sector raises unique ethical issues.[12] Because AI depends on the data it is given, program developers need to prevent their programs from incorporating data that reinforces historical inequities or reflects bias, which can affect fair access and pricing in the markets.[13] Michael J. Hsu, Acting Comptroller of the Currency, also flagged that AI has the capacity to enable fraud and the spread of misinformation.[14] Gensler likewise wrote that "the outcomes of its predictive algorithms may be based on data reflecting historical biases as well as latent features which may inadvertently be proxies for protected characteristics."[15]

4. Accountability. Given AI's ability to teach itself and to move further from its initial programming as it learns, who should be held accountable if it makes an error?[16] Providers should be capable of answering this question as their AI initiatives expand.[17] Gensler's 2020 working paper noted that the outcomes of deep learning models are often unexplainable and that "human agency and traditional intervention approaches may be lost as a consequence of lack of model explainability and transparency."[18]

IAC Recommendations

To address these concerns, the SEC's Investor Advisory Committee (IAC) has recently recommended that the SEC focus on the tenets of equity, consistent and persistent testing, and governance and oversight when developing additional AI guidance.[19] In furtherance of these tenets, the IAC has recommended that the SEC hire additional employees with AI and machine learning expertise.[20] The IAC also encouraged the SEC to draft and publish best practices regarding the use of AI and to expand guidance on the unique aspects of algorithm-based investment models, including enhanced monitoring and/or risk-based reviews of the use of AI.[21]

Robo-Advisers and Algorithmic Trading

Looking ahead, existing SEC regulation of robo-advisers and algorithmic trading will likely serve as guidance for future AI regulation. For example, the SEC has requested comment as to whether index and model portfolio providers, as well as pricing services, should be considered investment advisers.[22] Although the request for comment does not specifically mention AI, it implicitly raises AI-related questions to the extent that providers use AI to perform their services. While these types of providers may not currently need to register as investment advisers, or may be exempt under the publisher's exclusion, the SEC could nonetheless mandate that the developers of these types of services register with the SEC.[23]

The SEC has already addressed its treatment of automated advisers, often referred to as robo-advisers, and determined that they should be treated as traditional SEC-registered investment advisers, as defined by the Investment Advisers Act of 1940 (Advisers Act). Given this categorization, and the robo-adviser's unique business model, which relies on algorithms and involves limited, if any, interaction with clients, the SEC's Division of Investment Management and its Office of Compliance Inspections and Examinations outlined the following areas that robo-advisers should consider to ensure compliance with the Advisers Act.[24]

  1. Substance and presentation of disclosures to clients about the robo-adviser and the investment advisory services it offers.[25]
  2. Obligation to obtain information from clients to support the robo-adviser's duty to provide suitable advice.[26]
  3. Adoption and implementation of effective compliance programs reasonably designed to address particular concerns relevant to providing automated advice.[27]

These considerations are especially relevant because, under the Advisers Act, robo-advisers owe a fiduciary duty to their clients.[28] Accordingly, key aspects of this fiduciary duty include fully disclosing the robo-adviser's offerings and not deviating from the client's stated investment objective.[29]

Similarly, the SEC has regulated algorithmic trading. In doing so, the SEC approved a FINRA rule requiring algorithmic trading developers to register as "Securities Traders" if they are associated with a FINRA member and are primarily responsible for the design, development, or significant modification of algorithmic trading strategies, or are responsible for supervising or directing such activity.[30] Under the rule, an "algorithmic trading strategy" is defined as an automated system that generates or routes orders and order-related messages.[31] Further, the rule is designed to increase market transparency and accountability for firms engaged in electronic trading.[32]

Use of AI by the SEC

Like many other organizations, the SEC is adopting these new technologies to analyze complex data.[33] Scott W. Bauguess, the Deputy Director and Deputy Chief Economist for the Division of Economic and Risk Analysis (DERA), stated, "we have begun a host of new initiatives that leverage the machine learning approach to behavioral predictions, particularly in the area of market risk assessment, which includes the identification of potential fraud and misconduct."[34] For example, the Corporate Issuer Risk Assessment (CIRA) program was developed to detect anomalous patterns in financial reporting.[35]

While new technology may flag a filing as high risk, the classification alone is not a clear indicator of potential wrongdoing. Rather, as Bauguess explained, the data collected via machine learning methods helps the SEC prioritize examinations so that it can direct resources to the areas of the market that are most susceptible to potential violative conduct.[36] Thus, as of now, machine learning methods do not generally point to a particular action or conduct indicative of fraud or other violations; SEC staff must still look for the human element of fraud and scienter.[37]

The next article in this series will discuss FINRA's view of AI technology.

The author wishes to acknowledge the contributions of summer associate Henry Little.


[1] Securities and Exchange Commission, Prohibition of Conflicted Practices for Broker-Dealers That Use Certain Covered Technologies.

[2] Securities and Exchange Commission, Prohibition of Conflicted Practices for Investment Advisers That Use Certain Covered Technologies.

[3] Gary Gensler, Investor Protection in a Digital Age, Remarks Before the 2022 NASAA Spring Meeting & Public Policy Symposium (May 17, 2022).

[4] Securities and Exchange Commission, SEC Proposes New Requirements to Address Risks to Investors From Conflicts of Interest Associated With the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers (July 26, 2023).

[5] Securities and Exchange Commission, Fact Sheet – Conflicts of Interest and Predictive Data Analytics; Securities and Exchange Commission, Proposed Rule – Conflicts of Interest Associated with the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers (July 26, 2023).

[6] Fact Sheet, supra note 5.

[7] Id.

[8] Gary Gensler and Lily Bailey, Deep Learning and Financial Stability, Working Paper (November 1, 2020).

[9] Id.

[10] John Divine, How AI Could Spark the Next Financial Crisis, U.S. News (June 30, 2023).

[11] Id.

[12] Gensler, supra note 3; Acting Comptroller of the Currency Michael J. Hsu, Remarks to the American Bankers Association (ABA) Risk and Compliance Conference, "Tokenization and AI in Banking: How Risk and Compliance Can Facilitate Responsible Innovation" 9 (Jun. 16, 2023).

[13] Gensler, supra note 3.

[14] Hsu, supra note 12 at 11.

[15] Gensler, supra note 8.

[16] Hsu, supra note 12 at 9-10.

[17] Id.

[18] Gensler, supra note 8.

[19] Christopher Mirabile & Leslie Van Buskirk, Establishment of an Ethical Artificial Intelligence Framework for Investment Advisors, U.S. Securities and Exchange Commission Investor Advisory Committee 2 (Apr. 6, 2023).

[20] Id. at 3.

[21] Id.

[22] Securities and Exchange Commission, Request for Comment on Certain Information Providers Acting as Investment Advisers 3 (Aug. 16, 2022).

[23] Investment Advisers Act of 1940, 76th Cong. § 2(a)(11).

[24] U.S. Securities and Exchange Commission Division of Investment Management, Guidance Update 2 (Feb. 2017).

[25] Id.

[26] Id.

[27] Id.

[28] Id.

[29] U.S. Securities and Exchange Commission, Information for Newly-Registered Investment Advisers (Nov. 23, 2010).

[30] Bloomberg Professional Services, SEC Approves FINRA rule requiring registration of algorithmic trading developers (Apr. 20, 2016).

[31] U.S. Securities and Exchange Commission, Order Approving a Proposed Rule Change to Require Registration as Securities Traders of Associated Persons Primarily Responsible for the Design, Development, Significant Modification of Algorithmic Trading Strategies or Responsible for the Day-to-Day Supervision of Such Activities (Apr. 7, 2016).

[32] Bloomberg Professional Services, supra note 30.

[33] Scott W. Bauguess, Has Big Data Made Us Lazy?

[34] Id.

[35] Mark J. Flannery, Insights into the SEC's Risk Assessment Programs.

[36] Bauguess, supra note 33.

[37] Id.
