There are 4 steps to get your Florida real estate agent license (also known as a real estate sales associate license):
Schedule an appointment to have your fingerprints taken by a Livescan Service Provider registered with FDLE (Florida Department of Law Enforcement). It can take up to 5 days for your fingerprints to be processed. Expect to pay about $37-60 to get your prints taken. Be sure to know your social security number when you go, bring your ID, and use the following ORI number to ensure your prints get to the Florida DBPR Division of Real Estate: FL920010Z.
Submit your application (Form RE 1) to sit for the real estate sales associate exam. To obtain the fastest turnaround time, try to do this 1-2 days after you get fingerprinted. You are allowed to (and should) submit this application before taking the required 63-hour pre-license course. The fee for submitting an online application is $83.75.
Take and pass a 63-hour pre-license course. You can do this online. There are hundreds of different companies that offer versions of this course to choose from, and you can search through them here. Courses typically cost anywhere from $100-500. At the end of the course, you must pass the end-of-course exam with a score of at least 70%. This is a different exam from the state licensing exam.
Schedule and pass the state licensing exam. The exam is administered by Pearson Vue. You will need to show your 63-hour course completion slip in order to sit for the exam. You will also need to have received notice that your state application to take the exam (step 2) was accepted and approved. The exam consists of 100 multiple choice questions which you have 3.5 hours to answer. You must get a score of at least 75% to pass.
Florida Statute 475.17 specifies the requirements that you must meet in order for your application to take the real estate exam (step 2 from above) to be approved:
Be at least 18 years old,
Have a U.S. social security number (you don’t need to be a U.S. citizen, but you DO need to have an SSN),
Hold a high school diploma or equivalent (e.g. a GED),
Be honest, trustworthy, and of good character, and
Have a good reputation for fair dealing
Certain (but not all) types of criminal histories will be taken as evidence that an applicant is not honest, trustworthy, or of good character, or that they do not have a good reputation for fair dealing. In order to be licensed, applicants must also be fingerprinted and subjected to a criminal background check. If an applicant lies about their criminal history, that is grounds for immediate denial of a real estate license.
Additionally, the following events are almost always grounds for disqualification (although the Florida Real Estate Commission may choose to overlook them in certain situations after enough time has passed):
The applicant was previously denied a real estate license in Florida or another state,
The applicant was previously disbarred in Florida or another state,
The applicant’s registration or license to practice any regulated profession, business, or vocation was previously revoked or suspended in Florida or another state,
The applicant has acted, attempted to act, or held themself out as entitled to act as a real estate broker or sales associate in Florida, despite not having a license, within the 1 year before applying for a license. This is disqualifying even if no compensation was collected for these acts.
NOTE: If you hold a 4-year degree (or graduate degree) in real estate, you may be exempt from the pre-license course requirement.
What Disqualifies You From Being a Real Estate Agent in Florida?
Below is a (non-exhaustive) list of actions that can disqualify you from obtaining a real estate license:
You have previously been convicted of any type of fraud, misrepresentation, dishonest dealing, culpable negligence, breach of trust, or operating under false business pretenses in any business transaction in any state, nation, or territory.
You failed to deliver money, personal property, or documents after being ordered to do so by a court or after being required to do so by law. If you were previously involved in divorce proceedings and did not turn over required documents, money, or property on time, you might be disqualified from becoming a real estate agent.
You have been convicted of, found guilty of, or entered a plea of nolo contendere to any crime in any jurisdiction which directly relates to the activities of a licensed real estate broker or real estate sales associate, or which involves moral turpitude or dishonest dealing. Crimes likely to evidence moral turpitude include spousal abuse, domestic violence, rape, incest, failure to register as a sex offender, aggravated assault, drug-related crimes, animal cruelty, animal fighting, robbery & theft, felony hit and run, arson, prostitution, perjury, bribery, embezzlement, and driving while intoxicated without a license.
Appendix A: Excerpts from F.S. 475.17 relevant to initial licensure as a Florida real estate sales associate
(1)
(a) An applicant for licensure who is a natural person must be at least 18 years of age; hold a high school diploma or its equivalent; be honest, truthful, trustworthy, and of good character; and have a good reputation for fair dealing.
An applicant for an active sales associate’s license must be competent and qualified to make real estate transactions and conduct negotiations therefor with safety to investors and to those with whom the applicant may undertake a relationship of trust and confidence.
If the applicant has been denied registration or a license or has been disbarred, or the applicant’s registration or license to practice or conduct any regulated profession, business, or vocation has been revoked or suspended, by this or any other state, any nation, or any possession or district of the United States, or any court or lawful agency thereof, because of any conduct or practices which would have warranted a like result under [F.S. chapter 475], or if the applicant has been guilty of conduct or practices in this state or elsewhere which would have been grounds for revoking or suspending her or his license under [F.S. chapter 475] had the applicant then been registered, the applicant shall be deemed not to be qualified unless, because of lapse of time and subsequent good conduct and reputation, or other reason deemed sufficient, it appears to the Florida Real Estate Commission that the interest of the public and investors will not likely be endangered by the granting of registration.
The Commission may adopt rules requiring an applicant for licensure to provide written information to the Commission regarding the applicant’s good character.
(b) An application may be disapproved if the applicant has acted or attempted to act, or has held herself or himself out as entitled to act, during the period of 1 year next prior to the filing of the application, as a real estate broker or sales associate in Florida in violation of [F.S. chapter 475]. This paragraph may be deemed to bar any person from licensure who has performed any of the acts or services described in s. 475.01(3), unless exempt pursuant to s. 475.011, during a period of 1 year next preceding the filing of the application, or during the pendency of the application, and until a valid current license has been duly issued to the person, regardless of whether the performance of the act or service was done for compensation or valuable consideration.
(2)
(a)
1. In addition to other requirements under this part, the Commission may require the satisfactory completion of one or more of the educational courses or equivalent courses conducted, offered, sponsored, prescribed, or approved pursuant to s. 475.04, taken at an accredited college, university, or community college, at a career center, or at a registered real estate school, as a condition precedent for any person to become licensed or to renew her or his license as a sales associate. The course or courses required for one to become initially licensed shall not exceed a total of 63 classroom hours of 50 minutes each, inclusive of examination, for a sales associate… The satisfactory completion of an examination administered by the accredited college, university, or community college, by a career center, or by the registered real estate school shall be the basis for determining satisfactory completion of the course. However, notice of satisfactory completion shall not be issued if the student has absences in excess of 8 classroom hours.
2. A distance learning course or courses shall be approved by the Commission as an option to classroom hours as satisfactory completion of the course or courses as required by this section. The schools authorized by this section have the option of providing classroom courses, distance learning courses, or both. However, satisfactory completion of a distance learning course requires the satisfactory completion of a timed distance learning course examination. Such examination shall not be required to be monitored or given at a centralized location.
3. Such required course or courses must be made available by correspondence or other suitable means to any person who, by reason of hardship, as defined by rule, cannot attend the place or places where the course or courses are regularly conducted or does not have access to the distance learning course or courses.
(6) The education course requirements for one to become initially licensed do not apply to any applicant who has received a 4-year degree, or higher, in real estate from an accredited institution of higher education.
Appendix B: Definition of a real estate broker under Florida law
“Broker” means a person who, for another, and for a compensation or valuable consideration directly or indirectly paid or promised, expressly or impliedly, or with an intent to collect or receive a compensation or valuable consideration therefor, appraises, auctions, sells, exchanges, buys, rents, or offers, attempts or agrees to appraise, auction, or negotiate the sale, exchange, purchase, or rental of business enterprises or business opportunities or any real property or any interest in or concerning the same, including mineral rights or leases, or who advertises or holds out to the public by any oral or printed solicitation or representation that she or he is engaged in the business of appraising, auctioning, buying, selling, exchanging, leasing, or renting business enterprises or business opportunities or real property of others or interests therein, including mineral rights, or who takes any part in the procuring of sellers, purchasers, lessors, or lessees of business enterprises or business opportunities or the real property of another, or leases, or interest therein, including mineral rights, or who directs or assists in the procuring of prospects or in the negotiation or closing of any transaction which does, or is calculated to, result in a sale, exchange, or leasing thereof, and who receives, expects, or is promised any compensation or valuable consideration, directly or indirectly therefor; and all persons who advertise rental property information or lists. A broker renders a professional service and is a professional within the meaning of s. 95.11(4)(b). Where the term “appraise” or “appraising” appears in the definition of the term “broker,” it specifically excludes those appraisal services which must be performed only by a state-licensed or state-certified appraiser, and those appraisal services which may be performed by a registered trainee appraiser as defined in part II. The term “broker” also includes any person who is a general partner, officer, or director of a partnership or corporation which acts as a broker. The term “broker” also includes any person or entity who undertakes to list or sell one or more timeshare periods per year in one or more timeshare plans on behalf of any number of persons, except as provided in ss. 475.011 and 721.20.
Appendix C: Definition of a real estate sales associate under Florida law
“Sales associate” means any person who performs any act specified in Appendix B in the definition of “broker”, but who performs such act under the direction, control, or management of another person.
Real Estate Broker Experience Requirements by State
Real estate licensing laws vary from state to state, but all states have at least two tiers of license. The lower tier is usually called “sales associate” or “salesperson”. The higher tier is usually called “broker”. In some states, like Illinois and Washington, the lower tier is actually called “broker” while the higher tier is called “managing broker”. However, what the lower-tier license allows you to do is mostly the same across states, regardless of whether lower-tier licensees are referred to as sales associates, brokers, or something else.
The lower tier license gives you the ability to buy and sell real estate on a client’s behalf, but only under the supervision of someone with a higher-tier license. The higher tier license (what most states call a “broker’s license”) gives you the right to work independently without the oversight of another real estate professional.
Usually, the second (higher) tier license also gives you the ability to hire, supervise, and manage lower tier licensees. However, some states such as New Mexico actually have three tiers of licenses: (1) a license that lets you work under the supervision of another professional, (2) a license that lets you work independently, and (3) a license that lets you supervise other licensed real estate professionals.
In this article, I’m going to use the term “broker” to refer to whichever license type gives someone the ability to work independently as a real estate professional in a given state. That’s the second-tier license, which is usually also the highest license a state offers.
Each state has its own criteria you must satisfy in order to become a licensed broker. However, all states require some amount of prior experience holding a real estate license (in the same or a different state) before you can apply to become a broker. The map below shows how long you need to hold a first-tier (lower level) license in a state, as a resident of that state, before you can apply to become a real estate broker in that state.
Importantly, this map only represents the time experience requirements. In some states such as Florida, the only experience requirement to become a real estate broker is to hold an active real estate sales associate license for at least 2 years. However, in other states such as Utah, you also need to accumulate a certain number of “experience points” which are gained from participating in real estate transactions.
If you buy a house as your personal residence, the IRS does not allow you to deduct any HOA fees. However, if you buy a house for use as a rental property, the IRS does allow you to deduct HOA fees on your tax return. If you buy a house that you live in for 4 months of the year and list on Airbnb for the other 8 months of the year, then you can deduct two-thirds of your HOA fees (corresponding to the 8 out of 12 months when you are renting out the property) each year.
If you buy a timeshare, similar rules apply. If you pay HOA fees for a timeshare that you use personally, you cannot deduct those HOA fees from your taxable income. However, if you pay HOA fees for a timeshare that you list on Airbnb, then you can deduct the HOA fees.
Can I deduct HOA fees if I run a business from home?
Despite what I said earlier, you can deduct a percentage (not all) of your HOA fees on your personal residence if you work from home. The exact percentage you can deduct is based on the percentage of your space that you use for business. For example, if you have a 1,000 sq ft home and your home office takes up 120 sq ft, then you can deduct 12% of your HOA fees as a business expense.
However, there are some exceptions to this, such as if you live on a farm or other property with significant space outside of your house.
Are HOA fees tax deductible on a second home?
If you own a second home that you use for only a portion of the year, but you never rent out that second home (e.g. on Airbnb or VRBO) when you aren’t there, then you cannot deduct any of your HOA fees on that home. However, if you do rent out your second home (or at least try to by listing it on Airbnb), then you can deduct the HOA fees for the percentage of the year that the property is listed for rent. For example, if you use your second home for 6 months out of the year and list it on VRBO for the other 6 months, then you can deduct 50% of your HOA fees for that property.
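The proration arithmetic above is simple enough to put in a few lines of code. Here is a minimal Python sketch, assuming straight months-based and square-footage-based proration as described in this section; the function names and dollar amounts are illustrative assumptions, and none of this is tax advice.

```python
def rental_time_deduction(annual_hoa_fees: float, months_rented: int) -> float:
    """Deductible HOA fees when a property is rented (or listed for rent)
    for part of the year, prorated by months out of 12."""
    return annual_hoa_fees * months_rented / 12

def home_office_deduction(annual_hoa_fees: float, office_sqft: float, home_sqft: float) -> float:
    """Deductible HOA fees for a home office, prorated by the share of the
    home's square footage used for business."""
    return annual_hoa_fees * office_sqft / home_sqft

# Examples matching the article, assuming a hypothetical $3,000 in annual HOA fees:
# 8 of 12 months listed on Airbnb, and a 120 sq ft office in a 1,000 sq ft home.
print(rental_time_deduction(3000, 8))          # 2000.0 (two-thirds of $3,000)
print(home_office_deduction(3000, 120, 1000))  # 360.0 (12% of $3,000)
```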
The EU Digital Services Act (DSA) is a big piece of legislation, like GDPR was, so I’m going to leave a lot out and focus on summarizing. In particular, I’m going to answer the following questions:
Which companies are subject to the rules of the DSA?
What are those rules?
What is the penalty for noncompliance?
When is the deadline for companies to be in compliance with the DSA to avoid penalties?
What’s the difference between the EU’s Digital Services Act (DSA) and the EU’s Digital Markets Act (DMA)?
Let’s jump in.
Which companies are subject to the rules of the DSA?
Let’s start with some companies which are NOT subject to the DSA:
U.S. bloggers
U.S. e-commerce sellers (individual sellers, that is; marketplaces such as Amazon or eBay are covered)
The DSA only applies to companies which meet two conditions:
The company is an “intermediary service provider”, and
The company has a “substantial connection” to the European Union.
There are additional rules for companies which also qualify as “online platforms”, “online search engines”, and/or “very large” providers, and there are somewhat fewer rules for certain companies which are deemed to be “micro or small enterprises”.
Intermediary service provider
Intermediary service means one of the following “information society services” (a term defined a few paragraphs below):
A mere conduit service, consisting of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
A caching service, consisting of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate, and temporary storage of that information, performed for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request; and
A hosting service, consisting of the storage of information provided by, and at the request of, a recipient of the service.
Examples of mere conduit services:
A private messaging app like Facebook Messenger
Examples of caching services:
Content delivery networks (CDNs)
Examples of hosting services:
Cloud storage providers like Dropbox
Cloud computing providers like AWS, GCP, and Azure
An information society service is any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. The definition is the same as provided in Article 1(1), point (b), of Directive (EU) 2015/1535.
For the purposes of this definition:
‘at a distance’ means the service is provided without the parties being simultaneously present;
‘by electronic means’ means that the service is sent initially and received at its destination by means of electronic equipment for the processing (including digital compression) and storage of data, and entirely transmitted, conveyed, and received by wire, radio, optical, or other electromagnetic means;
‘at the individual request of a recipient of services’ means that the service is provided through the transmission of data on individual request.
Examples of information society services:
eBay
Hotels.com
Google Search
Uber
Non-Examples of information society services:
A taxi driver providing transportation services (service is not provided ‘at a distance’ from the passenger)
A medical examination at a doctor’s office using electronic equipment where the patient is physically present (service is not provided ‘at a distance’).
Telemarketing done by a human (service is not considered to be provided by ‘electronic means’)
Radio broadcasting (service is not supplied ‘at the individual request of a recipient of services’)
Substantial connection to the EU
A company is deemed to have a “substantial connection” to the EU if it meets any of the following conditions:
The service provider has an establishment in the EU,
The number of consumer & business users of the service in one or more EU member states is significant in relation to the population thereof, or
The service provider targets activities towards one or more EU member states.
Whether a service provider is considered to be targeting activities towards one or more member states is a matter of facts and circumstances, including the following factors:
The use of a language or currency generally used in a member state,
The possibility of residents of a member state ordering products or services,
The use of a relevant top-level domain,
The availability of an app in a relevant national app store,
The provision of local advertising,
The provision of advertising in a language used in a member state, or
Providing customer service in a language used in a member state.
A substantial connection will also be construed to exist if a service provider directs its activities to one or more member states within the meaning of Article 17(1), point (c) of Regulation (EU) No 1215/2012.
However, the mere technical accessibility of a website from the EU cannot, on that ground alone, be considered as establishing a substantial connection to the EU.
Online platforms
Online platforms are hosting service providers which also make hosted data publicly available. More specifically, an online platform is a provider of hosting services which, at the request of a user, stores and disseminates information to the public (unless that activity is a minor or purely ancillary feature of another service or a functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the Digital Services Act).
Examples of online platforms:
Social media apps (e.g. Facebook, Snapchat)
Online marketplaces (e.g. Amazon, eBay)
Non-Examples of online platforms:
The comment section of an online newspaper
Cloud computing, cloud storage, and web hosting services (i.e. web infrastructural services)
Email service providers
Private messaging apps
Online platforms are subject to additional rules beyond what all hosting service providers are subjected to.
Online search engines
An online search engine is an intermediary service provider that allows users to input queries in order to perform searches of, in principle, all websites, or all websites in a particular language, on the basis of a query on any subject, in the form of a keyword, voice request, phrase or other input, and returns results in any format in which information related to the requested content can be found.
Notably, an online search engine is not necessarily an online platform or even a hosting service provider.
“Very large” online platforms and search engines
The DSA’s strictest rules only apply to “very large online platforms” (VLOPs) and “very large online search engines” (VLOSEs).
A VLOP is an online platform that, for at least 4 consecutive months, provides services to at least 45 million average monthly active users in the EU (corresponding to about 10% of the EU population), and that has been designated by the European Commission as a VLOP.
A VLOSE is an online search engine which reaches, on average, at least 45 million monthly active recipients of the service in the EU, and which has been designated as a VLOSE by the European Commission.
The European Commission has currently designated the following companies as VLOPs and VLOSEs:
Alphabet (Google Search, YouTube, Google Play, Google Shopping, Google Maps)
Microsoft (Bing, LinkedIn)
Meta (Facebook, Instagram)
Bytedance (TikTok)
Snap (Snapchat)
Pinterest
Twitter
Apple (Apple App Store)
Wikimedia (Wikipedia)
Amazon (Amazon Marketplace)
Booking.com
Alibaba (AliExpress)
Zalando
What rules do intermediary service providers with a substantial connection to the EU have to comply with under the DSA?
The DSA prescribes rules based on two dimensions: (1) company size, and (2) type of service provided by the company. The table below shows which articles of the DSA apply to which sizes and types of companies.
Table 1: Which DSA articles apply, by type of service and company size

| Type of service | Micro and small enterprises | Platforms that only allow consumers to buy from micro and small enterprises (and are not very large) | All other companies | Very large companies (VLOPs/VLOSEs) |
|---|---|---|---|---|
| Mere conduit service providers | Articles 11-15 | n/a | Articles 11-15 | Articles 11-15 |
| Caching service providers | Articles 11-15 | n/a | Articles 11-15 | Articles 11-15 |
| Hosting service providers | Articles 11-15, 16-18 | n/a | Articles 11-15, 16-18 | Articles 11-15, 16-18 |
| Online platforms | Articles 11-15, 16-18, 24(3) | Articles 11-15, 16-18, 20-28 | Articles 11-15, 16-18, 20-28 | Articles 11-15, 16-18, 20-28, 33-43 |
| Online platforms allowing consumers to buy goods or services | Articles 11-15, 16-18, 24(3), 30-32 | Articles 11-15, 16-18, 20-28 | Articles 11-15, 16-18, 20-28, 30-32 | Articles 11-15, 16-18, 20-28, 33-43 |
| Online search engines | Articles 11-15 | n/a | Articles 11-15 | Articles 11-15, 33-43 |
The primary obligations prescribed by the DSA fall into 4 categories: content moderation, fair (algorithm & business policy) design, transparency, and oversight. Table 2 below summarizes the obligations in each of those categories for each type of company subject to the DSA.
Table 2: Summary of DSA obligations by category and company tier

| Obligations | Universal (all providers of mere conduit, caching, and hosting services) | Basic (all hosting services) | Advanced (medium to large online platforms) | Special (VLOPs & VLOSEs) |
|---|---|---|---|---|
| Content moderation | Article 14 (fair content moderation) | Article 16 (notice); Article 17 (statement of reasons) | Article 20 (internal redress); Article 21 (out-of-court mechanism); Article 22 (trusted flaggers); Article 23 (anti-abuse provisions); Articles 30-32 (specific rules on B2C marketplaces) | Articles 34-35 (risk assessment and mitigation); Article 36 (crisis response mechanism) |
| Fair design (user interfaces, recommender systems, advertising, and other parts) | Article 14 (fair content moderation) | Article 16 (user-friendly notice and action) | Article 25 (fair design of user experience); Article 26(3) (advertising); Article 27 (recommender systems); Article 28 (protection of minors); Article 30 (traceability of traders); Article 31 (facilitating design for traders) | Article 38 (recommender systems); Article 39 (advertising repository) |
| Transparency | Article 15 (annual reporting) | Article 17(5) (database of all statements of reasons) | Article 22 (reports by trusted flaggers); Article 24 (content moderation reports); Article 26 (advertising disclosure) | Article 39 (advertising archives); Article 42 (content moderation transparency) |
| Oversight | Article 11 (regulator’s contact point); Article 12 (recipient’s contact point); Article 13 (legal representative) | Article 18 (notification of suspected crimes) | (-) | Article 37 (auditing); Article 40 (data access/scrutiny); Article 41 (compliance function) |
Content moderation:
All intermediary service providers must comply with DSA Article 14. Hosting service providers must also comply with Articles 16 and 17. Online platforms must also comply with Articles 20-23 and 30-32. And VLOPs and VLOSEs must comply with Articles 34-36.
Article 14 (content moderation): In their terms of service, companies must include information on any policies, procedures, measures, and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint handling system. This information must be set out in clear, user-friendly, and unambiguous language, and must also be made publicly accessible in a machine-readable format.
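Article 14 does not prescribe a particular machine-readable schema, so the snippet below is only a hypothetical sketch of how a provider might publish a machine-readable summary of its content moderation policies alongside the human-readable terms of service. The field names, URLs, and values are my own assumptions, not anything mandated by the DSA.

```python
import json

# Hypothetical machine-readable summary of content moderation policies.
# The DSA does not mandate this schema; every field name here is illustrative.
moderation_policy = {
    "terms_of_service_url": "https://example.com/terms",
    "content_moderation": {
        "automated_tools": ["hash matching", "text classifier"],
        "human_review": True,
        "possible_restrictions": ["removal", "demotion", "account suspension"],
    },
    "internal_complaint_handling": {
        "how_to_file": "https://example.com/appeals",
        "target_response_time_days": 14,  # illustrative target, not a DSA requirement
    },
}

print(json.dumps(moderation_policy, indent=2))
```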
Article 16 (content moderation): Hosting service providers must put user-friendly mechanisms in place to allow any individual or entity to notify them (electronically) of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Hosting providers must then act on this knowledge to prevent becoming liable for the content. In particular, they must process any such notices and make a decision (possibly using automation) in a timely, diligent, non-arbitrary, and objective manner.
Article 17 (content moderation): If a hosting service provider imposes any restrictions on a user’s account, the provider must provide that user with a clear and specific reason why such restriction was taken. The provider must also provide information on the possibilities for redress available to the user in respect of the decision, such as through an internal complaint-handling mechanism, out-of-court dispute settlement, and/or judicial redress.
Article 20 (content moderation): An online platform must provide its users (and any person who files an Article 16 notice) with a mechanism for lodging complaints against the online platform’s decision to restrict or not restrict a piece of content or a user who posted it. Complaints must not be handled on a purely automated basis.
Article 21 (content moderation): Out-of-court dispute settlement. Users and Article 16 notice filers lodging an Article 20 complaint shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with Article 21(3) in order to resolve disputes relating to those decisions. The settlement body shall NOT have the power to impose a binding settlement of the dispute on the parties, however. The fees charged by the settlement body to the online platform must be reasonable and not in excess of the costs incurred by the body. For the complaint lodger, dispute settlement must be available free of charge or for a nominal fee.
Article 22 (content moderation): Trusted flaggers. Online platforms must allow government-certified “trusted flaggers” to submit Article 16 notices and have those notices prioritized for speedy processing & decision.
Article 23 (content moderation): Online platforms must suspend the accounts of users who, after having been issued a prior warning, continue to frequently provide manifestly illegal content. Additionally, online platforms must suspend the ability of a person to file an Article 16 notice if that person frequently submits unfounded complaints through the Article 16 mechanism. Online platforms must have clearly disclosed policies for when either of the previously mentioned types of suspensions will be implemented.
Article 30 (content moderation): E-commerce seller information. Online platforms that allow consumers to buy goods or services online (i.e. to “conclude distance contracts with traders”) must collect the following information from each trader and make reasonable efforts to verify it against freely accessible online databases: (a) the trader’s name, address, phone number, and email address, (b) a copy of the trader’s identification document as required by Article 3 of Regulation (EU) 910/2014, (c) the trader’s payment account details, (d) where the trader (business) is registered, and (e) a self-certification by the trader committing to only offer products or services that comply with EU law. If a trader fails to provide the information, or provides information that is inaccurate, incomplete, or out of date, then, after warnings, the online platform must suspend the trader’s ability to use the service. Traders must be able to lodge an Article 20/21 complaint if they disagree with their suspension. The online platform must make the information from (a), (d), and (e) available to users of the platform.
Article 31 (content moderation): Online platforms that allow traders to sell goods or services must ensure traders provide basic info about the products or services they are selling, and any trademark or logo of the seller. Platforms also must make reasonable efforts to perform random checks to ensure that the products or services sellers are offering are not illegal.
Article 32 (content moderation): If an online platform that allows traders to sell goods or services learns that an illegal product or service has been offered by a trader to consumers located in the EU, then the platform must notify consumers who purchased the product or service, and the notice must include information about any relevant means of redress.
Article 34 (content moderation): VLOPs and VLOSEs must perform an EU systemic risk assessment annually and before deploying any functionality likely to have a critical impact on such risks. Systemic risks that should be analyzed include:
(a) the dissemination of illegal content through their services,
(b) any actual or foreseeable negative effects on the exercise of fundamental rights (including human dignity (Charter Article 1), respect for private and family life (Article 7), protection of personal data (Article 8), freedom of expression and information, including the freedom and pluralism of the media (Article 11), non-discrimination (Article 21), the rights of the child (Article 24), and a high level of consumer protection (Article 38)),
(c) any actual or foreseeable negative effects on civic discourse and electoral processes, and public security, and
(d) any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to a person’s physical and mental well-being.
When conducting risk assessments, VLOPs and VLOSEs must take into account their content moderation systems.
Article 35 (content moderation): VLOPs and VLOSEs must put in place reasonable, proportionate, and effective mitigation measures tailored to the specific systemic risks identified in their Article 34 risk assessment(s). Such measures may include:
(c) adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;
(g) initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;
(k) ensuring that an item of information, whether it constitutes a generated or manipulated image, audio, or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.
Article 36 (content moderation): VLOPs and VLOSEs must comply with crisis response orders from the Commission to assess whether and to what extent their services contribute to particular threats to public security or public health, identify specific measures to prevent or eliminate such contributions, and report to the Commission on the implementation and effectiveness of those measures.
Fair design:
All intermediary service providers must comply with DSA Article 14. Hosting service providers must also comply with Article 16. Online platforms must also comply with Articles 25-28 and 30-31. And VLOPs and VLOSEs must comply with Articles 38-39.
Article 14 (fair design): Any restrictions imposed by the terms of service on the use of an intermediary service must be applied and enforced in a fair, diligent, objective, and proportionate manner, to all parties.
Article 16 (fair design): Hosting service providers shall process any Article 16 notices in a timely, diligent, non-arbitrary, and objective manner.
Article 25 (fair design): Online platforms must not design, organize, or operate their online interfaces in a way that deceives or manipulates the recipients of the service, or in a way that otherwise materially distorts or impairs the ability of the recipients of the service to make free and informed decisions. This prohibition does not apply to practices covered by Directive 2005/29/EC or Regulation (EU) 2016/679. The Commission may issue guidelines on how the prohibition applies to specific practices, including:
Giving more prominence to certain choices when asking the recipient of the service for a decision,
Repeatedly requesting the recipient of the service make a choice (e.g. via a pop-up) where that choice has already been made,
Making the procedure for terminating a service more difficult than subscribing to it.
Article 26 (fair design): Ads on online platforms must be clearly marked as ads, must disclose who is running the ad, and must provide information, clearly accessible from the ad, about the main parameters used to determine the recipient to whom the ad is presented and, where applicable, how to change those parameters. Online platforms may not present ads based on profiling as defined in Article 4, point (4) of Regulation (EU) 2016/679 using special categories of personal data referred to in Article 9(1) of Regulation (EU) 2016/679.
Article 27 (fair design): Where several options of recommender system are available under the terms of service of an online platform, that online platform must make available a functionality that allows the recipient of the service to select and modify at any time their preferred option.
Article 28 (fair design): Providers of online platforms shall not present advertisements on their interface based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor.
Article 30 (fair design): Traders who are suspended from an e-commerce platform pursuant to Article 30 are entitled to fair treatment if they lodge a complaint about the suspension.
Article 31 (fair design): Providers of online platforms allowing consumers to conclude distance contracts with traders shall ensure that its online interface is designed and organised in a way that enables traders to comply with their obligations regarding pre-contractual information, compliance and product safety information under applicable Union law.
Article 38 (fair design): VLOPs and VLOSEs that use a recommender system must provide at least one option for each of their recommender systems which is not based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679.
Article 39 (fair design): VLOPs and VLOSEs that present ads on their online interfaces must make publicly available a repository of all the ads run on the platform.
Transparency:
All intermediary service providers must comply with DSA Article 15. Hosting service providers must also comply with Article 17. Online platforms must also comply with Articles 22, 24, and 26. And VLOPs and VLOSEs must comply with Articles 39 and 42.
Article 15 (transparency): Intermediary service providers (unless they are small or micro enterprises and not also VLOPs or VLOSEs) must make publicly available, at least once a year, a report on any content moderation they engaged in.
Article 17 (transparency): Hosting service providers must provide a clear and specific statement of reasons to any recipients of the service affected by restrictions due to illegal or incompatible content.
Article 22 (transparency): Trusted flaggers must publish, at least once a year, a comprehensive and detailed report on notices submitted in accordance with Article 16.
Article 24 (transparency): Online platforms must include in their Article 15 reports information about the number, processing times, and outcomes of disputes submitted to Article 21 out-of-court dispute settlement bodies.
Article 26 (transparency): Online platforms that present ads must clearly mark content as ads and disclose info about who is running the ad and how the ad is targeted.
Article 39 (transparency): VLOPs and VLOSEs must publicly publish (through a human interface and through API) a database of all ads served through their online interfaces, including the content of the ads, the name of the product or service being sold or the brand being advertised, the person (natural or legal) on behalf of whom the ad was run, the person who paid for the ad, the period during which the ad was run, and the parameters used to target the ad.
Article 42 (transparency): VLOPs and VLOSEs must publish Article 15 reports every six months. VLOPs must also include additional information in these reports, including:
The human resources that the VLOP dedicates to content moderation for its EU services, broken down by each applicable official language of the EU member states,
The qualifications and linguistic expertise of the persons carrying out those content moderation services, and
The indicators of accuracy and related information referred to in Article 15(1), point (e), broken down by each official language of the EU member states.
VLOPs and VLOSEs must also make publicly available (a) a report with the results of their Article 34 risk assessment, (b) the specific mitigation measures they put in place pursuant to Article 35, (c) the audit report required by Article 37, (d) the audit implementation report described in Article 37, and (e) where applicable, info about the consultations conducted by the company in support of the risk assessments and the design of the risk mitigation measures.
Oversight:
All intermediary service providers must comply with DSA Articles 11-13. Hosting service providers must also comply with Article 18. And VLOPs and VLOSEs must comply with Articles 37 and 40-41.
Article 11 (oversight): Providers of intermediary services must designate a single point of contact to enable them to communicate electronically with EU member states’ authorities, the Commission, and the Article 61 Board.
Article 12 (oversight): Providers of intermediary services must designate a single point of contact that recipients of the service can use to directly and rapidly communicate with them electronically, in a manner which is not entirely automated.
Article 13 (oversight): Intermediary service providers serving the EU but not having an establishment in the EU must designate a legal representative in the EU, and this representative shall be liable for non-compliance with the DSA.
Article 18 (oversight): Where a hosting service provider becomes aware of any information giving rise to a suspicion that a criminal offense involving a threat to the life or safety of a person has taken place, is taking place, or is likely to take place, the provider must promptly notify law enforcement.
Article 37 (oversight): VLOPs and VLOSEs must pay for an independent audit at least once a year to assess their compliance with the DSA.
Article 40 (oversight): VLOPs and VLOSEs must, if and as requested, provide the European Commission or national regulators (Digital Services Coordinators) and/or vetted researchers with data necessary to monitor and assess compliance with the DSA.
Article 41 (oversight): VLOPs and VLOSEs must establish a compliance department which is independent from their operational departments and is composed of one or more compliance officers, including the head of the compliance department.
What are the penalties for companies that fail to comply with the Digital Services Act?
Individual EU member states can set their own rules on penalties applicable to infringements of the DSA by providers of intermediary services. However, the states are limited to a maximum fine equal to 6% of the annual worldwide turnover (revenue minus VAT and other taxes) of the intermediary service provider in the preceding financial year.
Additionally, the max fine that may be imposed for supplying incorrect, incomplete, or misleading information, for failing to reply or rectify such information, or failing to submit to an inspection, shall be 1% of the annual income or worldwide turnover of the intermediary service provider in the preceding financial year.
The max amount of any periodic penalty payment shall be 5% of the average daily worldwide turnover or income of the provider of intermediary services for the preceding financial year, per day.
For comparison, the maximum fine under GDPR was 4% of global turnover. That means the DSA has 50% more teeth than the GDPR.
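As a rough illustration of how these caps scale, here is a small Python sketch that applies the headline percentages described above to a hypothetical provider. The turnover figure is made up, the daily-turnover approximation (annual turnover divided by 365) is my own simplification, and actual fines are set by member states and the Commission within these ceilings.

```python
def dsa_max_penalties(annual_worldwide_turnover: float) -> dict:
    """Headline DSA penalty ceilings for a provider of intermediary services,
    based on the prior financial year's worldwide turnover."""
    return {
        "max_fine_for_infringement": 0.06 * annual_worldwide_turnover,        # 6% cap
        "max_fine_for_incorrect_info": 0.01 * annual_worldwide_turnover,      # 1% cap
        # 5% of average daily turnover, approximating daily turnover as annual / 365
        "max_periodic_penalty_per_day": 0.05 * (annual_worldwide_turnover / 365),
    }

# Hypothetical provider with EUR 10 billion in annual worldwide turnover.
for name, amount in dsa_max_penalties(10_000_000_000).items():
    print(f"{name}: EUR {amount:,.0f}")
```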
When is the deadline for companies to be in compliance with the DSA to avoid penalties?
The DSA was published in the Official Journal of the European Union on October 27, 2022. The DSA “entered into force” on November 16, 2022. However, this only started the processes to determine which tech companies qualify as “very large”. The actual rules of the DSA do not apply to companies until later.
For most companies, the DSA rules will only apply to companies starting 15 months later — on February 17, 2024. However, VLOPs and VLOSEs have to comply with all obligations in Articles 11-43 of the DSA starting 4 months after being designated as VLOPs or VLOSEs. The first set of companies were given those designations on April 25, 2023, which means Facebook, Google, and other large companies have been liable for non-compliance since August 25, 2023.
TL;DR:
Most companies will need to comply with DSA rules starting on February 17, 2024, but some large tech companies have been subject to the DSA since August 25, 2023.
3 Business Opportunities Created by the Digital Services Act
1. Article 21 case law data & analytics tool
The certified out-of-court dispute settlement bodies described in Article 21 of the DSA are essentially going to be mini courts. These mini courts will generate precedents with each case reviewed, and like with ordinary courts, there will be a need for a software tool that can track and search all of these cases in the settlement bodies’ efforts to make fair and consistent decisions.
I think you could create a mini LexisNexis or Westlaw by partnering with all of the organizations that get certified as settlement bodies under the DSA. You’d collect their data, combine it with the data from all the other certified settlement bodies, index the data and make it searchable, and then sell the resulting packaged data tool back to the certified settlement bodies.
2. Article-16-as-a-Service
All intermediary service providers of all sizes must comply with Article 14 of the DSA, which requires them to disclose their policies for content moderation and internal complaint handling in both clear, user-friendly natural language and a machine-readable format. Additionally, all hosting service providers are subject to Article 16’s requirement to have a mechanism for users and third parties to notify them of specific content that the notifier considers illegal.
I think you could create a SaaS product that is “Article-16-as-a-Service” for intermediary service providers. The tool would be configurable to connect to the intermediary service provider’s database and would give users and third parties an interface for filing notices about content and disputing decisions. The tool could also generate the terms-of-service text and a machine-readable description disclosing these processes, and make those generated disclosures available for simple insertion into the service provider’s website.
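To give a feel for what the core of such a tool might store, here is a minimal Python sketch of a notice record and a completeness check, loosely based on the elements Article 16(2) asks notices to contain. The class name, field names, and the `is_actionable` check are my own assumptions for illustration, not part of the DSA or any real product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Article16Notice:
    """One illegal-content notice, loosely modeled on DSA Article 16(2).
    Field names are illustrative, not prescribed by the regulation."""
    content_urls: list[str]       # exact electronic location(s) of the content
    explanation: str              # why the notifier considers the content illegal
    notifier_name: str | None     # may be omitted for certain offence categories
    notifier_email: str | None
    good_faith_statement: bool    # notifier confirms the notice is accurate and in good faith
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def is_actionable(notice: Article16Notice) -> bool:
    """Rough completeness check before routing a notice to automated or human review."""
    return bool(notice.content_urls) and bool(notice.explanation.strip()) and notice.good_faith_statement

notice = Article16Notice(
    content_urls=["https://example.com/post/123"],
    explanation="Listing offers counterfeit goods.",
    notifier_name="Jane Doe",
    notifier_email="jane@example.com",
    good_faith_statement=True,
)
print(is_actionable(notice))  # True
```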
3. Email newsletter
Article 39 of the DSA requires big companies like Facebook to disclose everyone’s ads publicly. That is bad for advertisers because as soon as you start running an ad that works, all your competitors will be able to see that it works and copy it. However, Article 39 only applies to “very large” online platforms and search engines. That means if you advertise on an email newsletter with 1 million subscribers, you don’t have to disclose that ad publicly to your competitors. Email also gives you the option to target your ad to different subsets of your audience, unlike sponsoring a YouTuber (you could of course buy ads targeted at a segment of a YouTube channel’s audience, but then Google would disclose your ad publicly).
All of this is to say, email newsletters are going to be a more attractive advertising channel for European businesses.
You could also start a newsletter ad agency that partners with lots of small to medium size newsletters instead of building a newsletter yourself.
Digital Services Act (DSA) vs Digital Markets Act (DMA)
The EU’s Digital Services Act (DSA) and Digital Markets Act (DMA) are two separate but related pieces of legislation passed in 2022. The DSA is concerned with harmful and illegal goods, services, and content online. In contrast, the DMA is concerned with anti-competitive behavior of internet companies.
The key concepts from the DMA are “core platform service” and “gatekeeper”.
Core platform service means any of ten categories of service listed in the DMA; the two most relevant here are:
Online intermediation services (not to be confused with “intermediary services” as defined in the DSA; an online intermediation service is defined in Article 2, point (2) of Regulation (EU) 2019/1150 as an information society service that allows business users to offer goods or services to consumers, with the goal of initiating direct transactions between those business users and consumers, and that is provided to business users on the basis of contractual relationships.)
Online advertising services, including any advertising networks, ad exchanges, and any other ad intermediation services, provided by an undertaking that also provides any other core platform service
A “gatekeeper” is an entity providing core platform services which has been designated as a gatekeeper by the EU Commission.
Do’s and Don’ts for Gatekeepers under the DMA
Do:
– Allow third parties to interoperate with the gatekeeper’s own services in certain specific situations
– Allow their business users to access the data that they generate in their use of the gatekeeper’s platform
– Provide companies advertising on their platform with the tools and information necessary for advertisers and publishers to carry out their own independent verification of their advertisements hosted by the gatekeeper
– Allow their business users to promote their offer and conclude contracts with their customers outside the gatekeeper’s platform

Don’t:
– Treat services and products offered by the gatekeeper itself more favorably in ranking than similar services or products offered by third parties on the gatekeeper’s platform
– Prevent consumers from linking up to businesses outside their platforms
– Prevent users from uninstalling any pre-installed software or app if they wish to
– Track end users outside of the gatekeeper’s core platform service for the purpose of targeted advertising, without effective consent having been granted
The DSA and DMA mark a generational shift in internet regulation. The goal of generation 1 laws was to protect free speech and innovation; the goal of generation 2 laws is to protect consumers and competition. The shift is from liability for content to liability (and accountability) for the design of services.
Glossary (EU tech law & regulation)
Content moderation means the activities, whether automated or not, undertaken by the providers of intermediary services, that are aimed, in particular, at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility, and accessibility of that illegal content or that information, such as demotion, demonetization, disabling of access to, or removal thereof, or that affect the ability of the recipients of the service to provide that information, such as the termination or suspension of a recipient’s account.
DSA stands for the EU Digital Services Act.
An information society service is any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. The definition is the same as provided in Article 1(1), point (b), of Directive (EU) 2015/1535.
For the purposes of this definition:
‘at a distance’ means the service is provided without the parties being simultaneously present;
‘by electronic means’ means that the service is sent initially and received at its destination by means of electronic equipment for the processing (including digital compression) and storage of data, and entirely transmitted, conveyed, and received by wire, radio, optical, or other electromagnetic means;
‘at the individual request of a recipient of services’ means that the service is provided through the transmission of data on individual request.
Examples of information society services:
eBay
Hotels.com
Google Search
Uber
Non-Examples of information society services:
A taxi driver providing transportation services (service is not provided ‘at a distance’ from the passenger)
A medical examination at a doctor’s office using electronic equipment where the patient is physically present (service is not provided ‘at a distance’).
Telemarketing done by a human (service is not considered to be provided by ‘electronic means’)
Radio broadcasting (service is not supplied ‘at the individual request of a recipient of services’)
Intermediary service means one of the following information society services:
A mere conduit service, consisting of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;
A caching service, consisting of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate, and temporary storage of that information, performed for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request; and
A hosting service, consisting of the storage of information provided by, and at the request of, a recipient of the service.
Examples of mere conduit services:
A private messaging app like Facebook Messenger
Examples of caching services:
Content delivery networks (CDNs)
Examples of hosting services:
Cloud storage providers like Dropbox
Cloud computing providers like AWS, GCP, and Azure
An online platform is a provider of hosting services which, at the request of a user, stores and disseminates information to the public (unless that activity is a minor or purely ancillary feature of another service or a functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the Digital Services Act).
Examples of online platforms:
Social media apps (e.g. Facebook, Snapchat)
Online marketplaces (e.g. Amazon, eBay)
Non-Examples of online platforms:
The comment section of an online newspaper
Cloud computing, cloud storage, and web hosting services (i.e. web infrastructural services)
Email service providers
Private messaging apps
A recipient of the service means, with respect to an intermediary service, any natural or legal person who uses an intermediary service, in particular for the purposes of seeking information or making it accessible.
Recommender system means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service or prioritize that information, including as a result of a search initiated by the recipient of the service, or otherwise determining the relative order or prominence of information displayed.
An intermediary service provider is considered to have a substantial connection to the EU if:
The service provider has an establishment in the EU,
The number of consumer & business users of the service in one or more EU member states is significant in relation to the population thereof, or
The service provider targets activities towards one or more EU member states.
Whether a service provider is considered to be targeting activities towards one or more member states is a matter of facts and circumstances, including the following factors:
The use of a language or currency generally used in a member state,
The possibility of residents of a member state ordering products or services,
The use of a relevant top-level domain,
The availability of an app in a relevant national app store,
The provision of local advertising,
The provision of advertising in a language used in a member state, or
Providing customer service in a language used in a member state.
A substantial connection will also be construed to exist if a service provider directs its activities to one or more member states within the meaning of Article 17(1), point (c) of Regulation (EU) No 1215/2012.
However, the mere technical accessibility of a website from the EU cannot, on that ground alone, be considered as establishing a substantial connection to the EU.
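Read as a checklist, the test above boils down to "any one prong is enough, but mere accessibility is not a prong". Here is a rough sketch of that reading in Python; the names are mine, and in practice this is a facts-and-circumstances legal assessment rather than a computation.

```python
# Illustrative sketch only (my own simplification of the test described above).
TARGETING_FACTORS = [
    "uses a language or currency generally used in a member state",
    "lets residents of a member state order products or services",
    "uses a relevant top-level domain",
    "is available in a relevant national app store",
    "provides local advertising",
    "advertises in a language used in a member state",
    "offers customer service in a language used in a member state",
]

def has_substantial_connection(
    has_eu_establishment: bool,
    significant_number_of_eu_users: bool,
    targets_activities_at_member_states: bool,
) -> bool:
    # Any one prong suffices. Mere technical accessibility of a website from the
    # EU is deliberately not an input, because it is not enough on its own.
    # Whether a provider "targets" member states is itself a judgment call based
    # on factors like those in TARGETING_FACTORS.
    return (
        has_eu_establishment
        or significant_number_of_eu_users
        or targets_activities_at_member_states
    )
```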
VLOP stands for Very Large Online Platform. This is a technical term used in the DSA. A VLOP is an online platform that, for at least 4 consecutive months, provides services to an average of at least 45 million monthly active users in the EU (corresponding to about 10% of the EU population), and that has been designated as a VLOP by the European Commission.
An online search engine is an intermediary service provider that allows users to input queries in order to perform searches of, in principle, all websites, or all websites in a particular language, on the basis of a query on any subject, in the form of a keyword, voice request, phrase or other input, and returns results in any format in which information related to the requested content can be found.
VLOSE stands for Very Large Online Search Engine. This is a technical term used in the DSA. A VLOSE is an online search engine which reaches, on average, at least 45 million monthly active recipients of the service in the EU, and which has been designated as a VLOSE by the European Commission.
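For a sense of scale, the arithmetic behind the 45 million threshold can be sketched as follows (the EU population constant is approximate, and designation itself is a formal decision by the European Commission rather than a calculation):

```python
# The 45 million threshold is roughly 10% of the EU's ~450 million residents.
EU_POPULATION = 450_000_000      # approximate
SIZE_THRESHOLD = 45_000_000      # average monthly active users in the EU

print(SIZE_THRESHOLD / EU_POPULATION)  # 0.1, i.e. about 10%

def meets_size_threshold(avg_monthly_active_eu_users: int) -> bool:
    # Crossing the threshold is necessary but not sufficient: the platform or
    # search engine must also be formally designated by the European Commission
    # before the extra VLOP/VLOSE obligations apply.
    return avg_monthly_active_eu_users >= SIZE_THRESHOLD
```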
Appendix: Video Summary of the Digital Services Act
[Transcript]
TikTok is now legally forbidden from shadow banning people in Europe, thanks to a new law that went into effect last week — the Digital Services Act. This law requires all big tech companies including TikTok and Instagram to disclose to any user when they have been restricted, and to clearly explain why.
The new law also mandates that big tech companies give Europeans the option to opt out of personalized recommendation algorithms, and give them an easy way to change the parameters used to target them with ads.
Perhaps most importantly of all though, the new law requires that big tech companies publish annual reports on how their services might negatively impact people, elections, and society, and to alter their services to mitigate those risks. Independent researchers and government regulators will audit those companies for compliance, and companies can be fined up to 6% of their global revenue if they don’t comply.
If you live in the U.S. and are feeling a bit jealous, you aren’t alone. Several U.S. lawmakers have been pushing for similar regulations in the U.S., although it may take years before anything gets passed.
What Does the Florida Real Estate Sales Associate Exam Cover?
The Florida real estate sales associate exam covers 19 topics. The topics (and their subtopics) are summarized below, along with what percentage of exam questions come from each topic.
1. Real estate brokerage activities and procedures (12%)
Brokerage offices
Advertising
Escrow or trust accounts – general rules
Broker held
Attorney/title company held
Commingling
Broker’s commission
Anti-trust laws
Lien law on real property
Sales Associate Commission
Math-Commission
Kickbacks
Change of employer
Types of business entities that may or may not register
Sole proprietorship
Partnership
Corporation
LLC
Trade names
Unlicensed personal assistants
2. Real estate contracts (12%)
Preparation of contracts
Essentials of a contract
Statute of frauds
Statute of limitations
Transfer of real property
Contract categories
Contract negotiation
Termination of contracts
Contracts important to real estate
Listing contracts
Buyer-broker agreement
Option contracts
Sale and purchase contracts
Mandatory disclosures
Material defects
Radon gas
Lead-based paint
Energy efficiency brochure
Homeowners Association
Property tax
Building code
Community Development District disclosure
3. Residential mortgages (9%)
Legal theories of mortgages
Loan instruments
Mortgage clauses
Types of mortgage loans
FHA
VA
Conventional
Methods of purchasing mortgaged property
Other types of financing
Qualifying buyers
Math-Finance
4. Property rights: estates, tenancies, condominiums, HOAs, and time-sharing (8%)
The nature of property
Physical components
Personal property
General property rights
Estate and tenancies
Fee simple
Life estate
Tenancy at will
Tenancy at sufferance
Tenancy in common
Joint tenancy
Tenancy by the entireties
Homestead
Protection of homestead
Tax exemption
Cooperatives, condominiums, and time sharing
A few confusing phrases that are commonly found in many contracts are “in consideration of the foregoing”, “subject to the foregoing”, and “notwithstanding the foregoing”. I wrote a short article explaining what each of those phrases means.
5. Real estate appraisal (8%)
Appraisal regulation / USPAP
Market value
Approaches to estimating real property value
Sales comparison approach
Cost-depreciation approach
Income capitalization approach
Comparative Market Analysis (CMA)
Broker Price Opinion (BPO)
6. Authorized relationships, duties, and disclosures (7%)
Law of agency
Brokerage relationships in Florida
Transaction broker
Single agent
No brokerage relationship
Transition to Transaction broker
Misrepresentation and fraud
Professional ethics
Sales associate to broker
7. Titles, deeds, and ownership restrictions (7%)
Title to real property
Acquiring legal title
Voluntary alienation
Involuntary alienation
Types of notice
Condition of title
Deeds
Deed clauses
Statutory deeds
Special purpose deeds
Ownership limitations and restrictions
Easements
Leases
General and specific liens
Public/government restrictions
Deed restrictions
8. License law and qualifications for licensure (6%)
Historical perspective of Florida real estate license law
Statutes and rules important to real estate
General licensing provisions
Sales associate qualifications for licensure
Post-licensing education
Continuing education
Broker requirements
Registration and licensure
Real estate services
Individuals who are exempt from a real estate license
9. Real estate related computations and closing of transactions (6%)
Computations math
Closing statements
10. Legal descriptions of real estate (5%)
Purposes of legal descriptions
Types of legal descriptions
Metes and bounds
Lot and block
Government survey system
Legal descriptions math
11. Types of mortgages and sources of financing (4%)
The mortgage market and money supply
Federal regulatory bodies
Primary mortgage market
Secondary mortgage market
Mortgage fees
12. Violations of license law, penalties and procedures (3%)
Complaint process – seven steps
Violations and penalties
Grounds for denial
Grounds for suspension
Grounds for revocation
Types of penalties
Real estate recovery fund
Legal terms to know
Disciplinary guidelines
13. Federal and state laws pertaining to real estate (3%)
The 5 States With the Cheapest Farmland in the U.S.
The USDA tracks the average value of U.S. farmland per acre in each state. This value includes both the land itself and any buildings and other agricultural structures on it. Below are the 5 states with the lowest average farmland values as of August 2022.
1. New Mexico ($610 / acre)
Including Native American reservations, the poverty rate in New Mexico is almost 20%. That widespread poverty, combined with a dry desert climate, makes New Mexico’s farmland the cheapest in the entire U.S. at just $610 per acre on average.
2. Wyoming ($850 / acre)
Wyoming is the least populated state in the U.S., with just 581,000 residents. That’s fewer people than live in either Oklahoma City or Memphis, Tennessee, which helps explain why Wyoming farmland is so cheap at just $850 per acre.
3. Montana ($1,030 / acre)
Montana farmland is worth only $1,030 per acre, but there are about 58 million acres of farmland in Montana, the second most of any state behind Texas. Multiplying those figures out (58 million × $1,030 per acre) means Montana’s farmland is worth roughly $60 billion in total.
4. Nevada ($1,060 / acre)
Colorado may be dry, but Nevada is even drier. Nevada only gets about 10 inches of rainfall every year, and the vast majority of the state is just desert. Farmland in Nevada costs just $1,060 per acre.
5. Colorado ($1,770 / acre)
The average annual precipitation in Colorado is only 18 inches, which is less than half of the global average. That dryness, combined with high elevations and thin air, means that Colorado farmland is not very productive and therefore not very valuable, costing just $1,770 per acre on average.