Complaints & Content Removal Policy (including Depicted‑Person Appeals)
Last Updated: July 31, 2025
This Complaints & Content Removal Policy (the “Policy”) is part of your agreement with Finsta (operated by Solutionize LTD (UK) and Solutionize LLC (USA), together “Finsta,” “we,” “us,” or “our”). By using Finsta or submitting a complaint under this Policy, you confirm the information you provide is truthful and accurate and you agree to this Policy.
Finsta is a privacy‑first, app‑based social platform connecting users with verified creators for premium one‑on‑one messaging, voice/video calls, and digital products in a closed, secure ecosystem. We enforce strict age and identity verification for both creators and consumers and maintain comprehensive content‑safety, consent‑tracking, and fraud‑prevention systems.
Part A — General Complaints & Content Removal
A1. Scope
This Part explains how we handle complaints about content or conduct on Finsta that may be:
- Illegal; and/or
- In violation of our Terms & Conditions.
A2. Our Commitments
- You may report content or conduct that may be illegal or violates our Standards via the channels below.
- Every complaint is reviewed and resolved within seven (7) business days.
- We will acknowledge receipt, provide a ticket ID, and notify you of the outcome.
- We maintain a fair, transparent appeals process (see A6 and Part B for depicted‑person appeals).
- We take proportionate action and keep records supporting our decisions.
“Business days” means Monday–Friday, excluding public holidays in England & Wales.
A3. How to Report
You can submit a complaint using any of the following:
- In‑App Report: Tap the ⋯ (three dots) menu on the relevant profile, message, media, or call record → Report.
- Email: admin@finsta-app.com (general complaints)
- Web Form: finsta-app.com/report
- Support (24/7 WhatsApp): +44 1904 500 751
- Postal:
  - Solutionize LTD (UK): 85 Great Portland Street, London, W1W 7LT, UK
  - Solutionize LLC (USA): 1111b South Governors Avenue, Dover, DE 19904, USA
- Optional (CCBill): If you prefer, you may also use the CCBill complaints page.
Helpful details to include: what the issue is and why it’s illegal or violates our Standards; where it appears in‑app (profile handle, chat/thread, media ID, timestamps); date/time observed; any consent information known to you (e.g., you are depicted and did not consent); and your contact email. If key details are missing, we may request more information to resolve the matter.
A4. Review & Resolution Process (7 Business Days)
- Receipt & Acknowledgment (Day 0–1): We assign a ticket ID, acknowledge via in‑app message or email, and preserve relevant evidence.
- Triage & Interim Safety (Day 0–2): Imminent‑harm risks (e.g., credible threats, child safety) receive immediate action, which may include temporary restrictions, content hiding, or account holds.
- Investigation (Day 0–5): A trained Safety reviewer evaluates: the reported material and context; consent and age‑verification signals (creator verification, model releases/attestations, hash‑matching for known NCII/CSAM, fraud indicators); prior trust & safety history; and applicable law/jurisdiction.
- Decision & Action (By Day 7): We decide whether the content/conduct violates law and/or our Standards and apply proportionate measures (see A5). We notify both the complainant and the affected user (where lawful and safe) of the outcome and next steps.
Child Safety & Exploitation: We have zero tolerance. Suspected CSAM, trafficking, or exploitation is removed immediately and reported to appropriate authorities (e.g., NCMEC/IWF or competent local authority). Related accounts are suspended pending outcome.
A5. Potential Outcomes
Depending on severity, intent, and history, actions may include:
- No Violation: No action; we explain our reasoning.
- Labels/Age‑Gating: Warning screens, sensitive‑content gates, age restrictions.
- Content Removal/Restrictions: Deletion of media/messages/listings; reduced visibility; geo‑blocking where appropriate.
- Feature Limits: Temporary or permanent limits on messaging, calls, media uploads, live streaming, or sales.
- Account Actions: Warnings/strikes, temporary suspension, permanent termination, device/payment‑instrument bans.
- Monetization & Funds: Holds or forfeiture of payouts for violating content or fraud; reversal of unlawfully obtained earnings; refunds where required by law or card‑brand rules.
- Reporting: Notifications to payment partners, platforms, or law enforcement where legally required or prudent.
- Remediation: Required policy education, re‑verification, or consent attestation before restoration of features.
A6. Appeals (General)
If you disagree with a decision under Part A:
- Who may appeal: the complainant (if we decline to act) or the affected user/creator (if we act on content/account).
- How: Reply to the outcome notice or email admin@finsta-app.com with your ticket ID and reasons.
- When: Submit within 14 calendar days of our decision. We generally complete appeals within seven (7) business days.
- Who reviews: A senior Safety reviewer not involved in the original decision. The reviewer may consult legal/safety specialists and will issue a final internal decision with reasons (subject to Part B for depicted‑person consent matters).
Part B — Depicted‑Person Appeals (Consent‑Based Takedowns)
B1. Who can use this process
Any person depicted in content on Finsta may request removal under this Part if they did not consent to the content or if their consent is void under applicable law (e.g., due to age, coercion, fraud, withdrawal of consent where required, or other legal defect).
B2. How to file a depicted‑person appeal
- Email admin@finsta-app.com (preferred) or use the in‑app Report tool and select Depicted‑Person/Consent.
- Provide: a description of the content; where it appears (profile handle, media ID, timestamps); confirmation that you are the person depicted; the basis on which consent was not given or is void; and any supporting evidence (e.g., proof of identity, model release status, prior communications).
B3. Review standard & timing
- We will acknowledge your appeal and resolve it within seven (7) business days.
- We will prioritize imminent‑harm risks and hide/restrict content during review where appropriate.
- If consent is confirmed not given or void, Finsta will remove the content and take proportionate additional action where warranted (see A5).
B4. Disagreements & neutral resolution
If there is a disagreement about a depicted‑person appeal, we will allow the disagreement to be resolved by a neutral body (e.g., an accredited dispute‑resolution provider or other mutually acceptable neutral). We will cooperate in good faith with the neutral process and implement the neutral body’s final determination.
B5. Notice to affected creator/user
Where lawful and safe, we will notify the affected user/creator of the request and decision, while protecting the reporter’s safety and privacy as appropriate.
Part C — Privacy, Due Process & Data Handling
- We balance reporter privacy with fair notice to affected users. We may withhold reporter identity where safety or law requires, while providing enough detail for a meaningful response.
- We preserve evidence relevant to investigations and comply with lawful requests from authorities.
- We process personal data as described in our Privacy Policy, including cross‑border transfers and retention consistent with legal obligations and platform safety.
Part D — Regulatory & Jurisdictional Information
- EU/EEA: You may also seek out‑of‑court dispute settlement or contact your Digital Services Coordinator under applicable law. You may still report to us and appeal internally under this Policy.
- UK: Nothing in this Policy affects your rights under UK consumer law or applicable online‑safety regulations.
- Other regions: You may have additional rights under local law. Where required, we will localize this Policy or provide region‑specific terms.
Part E — Transparency & Changes
- We maintain internal logs of complaints, actions taken, and resolution timelines. We may publish anonymized transparency summaries (e.g., volume of reports, action rates, average resolution time).
- We may update this Policy to reflect operational, legal, or regulatory changes. We will notify users of material updates in‑app or by email and indicate a new Last Updated date above.
Contact
- General & Safety: admin@finsta-app.com
- Legal & Urgent Safety: admin@finsta-app.com
- Appeals: admin@finsta-app.com | 24/7 WhatsApp: +44 1904 500 751
- Postal:
  - Solutionize LTD (UK): 85 Great Portland Street, London, W1W 7LT, UK
  - Solutionize LLC (USA): 1111b South Governors Avenue, Dover, DE 19904, USA
Your statutory rights are not affected.